<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>ElcomSoft blog</title>
	<atom:link href="https://blog.elcomsoft.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.elcomsoft.com</link>
	<description>«...Everything you wanted to know about password recovery, data decryption, mobile &#38; cloud forensics...»</description>
	<lastBuildDate>Thu, 02 Apr 2026 19:14:01 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Compelled Decryption: The East Asian Region</title>
		<link>https://blog.elcomsoft.com/2026/04/compelled-decryption-the-east-asian-region/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Fri, 03 Apr 2026 12:00:54 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[law]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=13073</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/04/law-3-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>This piece marks the third installment in our ongoing series analyzing compelled decryption laws. As digital evidence continues to play a central role in modern investigations, legal systems worldwide are actively addressing the friction between encrypted devices and law enforcement access. For this chapter, our geographic focus shifts to East Asia. The region provides a [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/04/law-3-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>This piece marks the third installment in our ongoing series analyzing compelled decryption laws. As digital evidence continues to play a central role in modern investigations, legal systems worldwide are actively addressing the friction between encrypted devices and law enforcement access. For this chapter, our geographic focus shifts to East Asia. The region provides a highly practical comparative landscape for observing how neighboring jurisdictions weigh the technical demands of modern forensics against individual procedural rights. To map these diverse approaches, the following sections review the current legal mechanisms in mainland China, Hong Kong, Taiwan, Japan, and South Korea.</p>
<h2>Mainland China</h2>
<p>Key Points:</p>
<ul>
<li>A legal system that prioritizes state security and investigative access over a broad individual right to refuse cooperation.</li>
<li>The clearest decryption duties fall on network operators, telecommunications companies, and internet service providers.</li>
<li>Cybersecurity and counter-terrorism laws explicitly require these entities to provide technical support, including decryption assistance in some contexts.</li>
<li>The system strongly favors state access, but current statutes do not clearly create a standalone criminal offense for an ordinary suspect who refuses to unlock a personal device.</li>
</ul>
<p>In the People&#8217;s Republic of China, digital investigations operate within a framework that prioritizes state security and investigative access over an individual right to refuse cooperation. Rather than building the justice system around an expansive right to silence, mainland law gives security authorities broad access powers and imposes specific duties of technical assistance on regulated entities, particularly in matters of national security, public safety, and counter-terrorism.</p>
<p>However, the legal mechanics of compelled decryption are more specific than often assumed. While criminal procedure laws impose a general civic duty to report crimes, they do not contain a blanket decryption mandate for ordinary citizens. Instead, the clearest and most forceful digital-access rules target corporate entities.</p>
<p>Under the country&#8217;s cybersecurity and counter-terrorism statutes, the burden of access falls heavily on network operators, telecommunications companies, and internet service providers. These laws legally bind operators to provide technical support to public and state security organs. In terrorism investigations, the mandate goes even further, expressly requiring service providers to supply technical interfaces and decryption assistance, backed by severe corporate penalties for non-compliance.</p>
<p>This creates a distinct legal reality: mainland Chinese law unequivocally compels decryption assistance from regulated technology providers. What it does not clearly do—at least on the face of current statutes—is create a neatly defined, standalone criminal offense that automatically penalizes an ordinary individual simply for refusing to hand over a personal passcode. The system undeniably favors state access, but it anchors its strictest decryption duties and enforceable penalties on the entities operating the networks rather than the individual suspect.</p>
<p>This framework, however, remains fluid. On January 31, 2026, the Ministry of Public Security published a Draft Law on Cybercrime Prevention and Control. While currently only a draft rather than enacted law, it signals an ongoing effort to consolidate and potentially expand the state&#8217;s oversight of digital spaces.</p>
<h2>Hong Kong</h2>
<p>Key Points:</p>
<ul>
<li>The implementation of the National Security Law significantly altered the region&#8217;s legal landscape.</li>
<li>Amendments enacted in March 2026 explicitly empower law enforcement to compel individuals to provide device passwords, specifically within the context of national security investigations.</li>
<li>Authorities are required to obtain legal authorization before compelling decryption, and refusing a lawful request is a criminal offense punishable by fines and up to a year in prison.</li>
</ul>
<p>Once a territory under the British Crown, Hong Kong was transferred back to Chinese sovereignty under the premise of &#8220;one country, two systems.&#8221; This diplomatic arrangement was designed to maintain the region&#8217;s status quo, preserving its independent judiciary, capitalist economy, and common law legal traditions distinct from the mainland. However, observing the current legal environment reveals a gradual shift in that structural separation, particularly in how digital rights and investigative powers are managed.</p>
<p>The implementation of the National Security Law (NSL) served as the primary catalyst for altering the region&#8217;s legal landscape, bridging the gap between Hong Kong&#8217;s common law history and a more centralized, security-oriented approach. Under amendments to the NSL implementation rules enacted in March 2026, law enforcement agencies are now empowered to compel individuals to hand over device passwords and provide decryption assistance. The scope of these provisions is formally restricted to cases involving matters of national security. Furthermore, procedural scrutiny is required; officials have clarified that police cannot randomly demand passcodes from the public. Law enforcement must first secure the appropriate legal authorization or a warrant to search the device before a password demand can be made.</p>
<p>Once that legal authorization is obtained, however, the burden is placed directly on the device owner or user. In these specific national security contexts, refusing to cooperate is not shielded by the traditional right to remain silent. The updated rules classify non-compliance as a criminal offense carrying substantial penalties that include fines of up to HK$100,000 and up to a year in prison.</p>
<h2>Taiwan</h2>
<p>Key Points:</p>
<ul>
<li>A self-governing democracy that maintains strict constitutional barriers against state coercion in digital spaces.</li>
<li>Article 95 of the Taiwanese Code of Criminal Procedure explicitly protects the right against self-incrimination.</li>
<li>The prevailing legal consensus views compelled password disclosure as a violation of procedural protections.</li>
<li>Authorities are unable to legally criminalize a suspect&#8217;s refusal to supply an encryption key, relying instead on independent forensic capabilities.</li>
</ul>
<p>Taiwan operates as a self-governing democracy, maintaining its distinct political and legal systems while navigating the persistent geopolitical threat of annexation. This ongoing reality has fostered a legal environment that is highly sensitive to state overreach and firmly anchored in democratic due process.</p>
<p>Reflecting these principles, the jurisdiction maintains strict barriers against state coercion in digital investigations. <a href="https://law.moj.gov.tw/ENG/LawClass/LawAll.aspx?pcode=C0010001">Article 95</a> of the Taiwanese Code of Criminal Procedure explicitly protects a suspect&#8217;s right against self-incrimination, a safeguard that the legal community applies directly to digital devices. The prevailing legal consensus across the country is that compelling an individual to disclose a password or encryption key constitutes a clear violation of these core procedural protections. Consequently, authorities cannot legally criminalize a suspect&#8217;s refusal to unlock a device. To access encrypted evidence, Taiwanese law enforcement must rely entirely on independent forensic capabilities and technical analysis rather than legal compulsion, preserving a definitive boundary between state investigative powers and the right to remain silent.</p>
<h2>The Chinese Ecosystem: A Comparative Summary</h2>
<p>Observing this specific geographic and political cluster provides a clear illustration of how different governance models directly shape the rules of digital evidence.</p>
<p>Mainland China anchors one end of the spectrum, operating under a framework where individual digital autonomy is structurally subordinated to the mandates of the security apparatus. The legal baseline strongly favors state access, placing the clearest and strictest decryption duties on network operators, telecommunications companies, and internet service providers.</p>
<p>Hong Kong currently occupies a practical middle ground. The territory applies these state-first compliance rules strictly to matters involving national security, while generally preserving its traditional protections against self-incrimination for standard criminal investigations.</p>
<p>Across the strait, Taiwan offers a direct contrast by actively defending the right against self-incrimination across the board. The jurisdiction maintains a recognized boundary between the state&#8217;s investigative reach and individual procedural rights, leaving the burden of decryption entirely on the authorities.</p>
<h2>Japan</h2>
<p>Key Points:</p>
<ul>
<li>A system navigating the friction between constitutional protections against self-incrimination and statutory requests for investigative cooperation.</li>
<li>Article 38 of the Constitution guarantees the right to silence and prohibits coerced confessions.</li>
<li>Article 111-2 of the Code of Criminal Procedure allows authorities executing a warrant to ask suspects to operate a device or provide &#8220;some other form of cooperation,&#8221; a phrase understood to include decryption.</li>
<li>The core ambiguity lies in this permissive wording: the statute authorizes a request for cooperation, but does not resolve how far investigators can push that request before violating Article 38.</li>
</ul>
<p>Japan’s justice system faces a unique friction point when it comes to digital evidence. At the constitutional level, Article 38 guarantees the right to remain silent and prohibits convictions based on coerced confessions. However, this foundational shield intersects with the country’s Code of Criminal Procedure. Under Article 111-2, when officers execute a search or seizure warrant involving electronic-record media, they may ask the person subject to the measure to operate the computer or provide &#8220;some other form of cooperation&#8221; &#8211; a clause understood to include decryption.</p>
<p>This is where the ambiguity arises. Article 111-2 gives investigators a statutory basis to request assistance, but its wording is strictly permissive. Officers &#8220;may ask&#8221; for cooperation, but the law does not expressly criminalize refusal or settle how far that request can be pressed against Article 38’s protection against self-incrimination. Because pushing a suspect to unlock a device raises immediate constitutional concerns, Japanese law enforcement tends to tread carefully. Rather than relying strictly on direct legal pressure, investigators lean on third-party forensic tools to unlock devices independently, or use standard interrogation to persuade suspects to grant access voluntarily. This approach reflects a system actively trying to navigate the digital barrier while reducing the risk of a constitutional clash.</p>
<h2>South Korea</h2>
<p>Key Points:</p>
<ul>
<li>A highly digitized society actively debating how digital evidence intersects with the right to remain silent.</li>
<li>Constitutional protections against self-incrimination frequently stall investigations when suspects refuse to unlock devices.</li>
<li>A prominent 2020 legislative push attempted to criminalize the refusal to provide a password, but the proposal was ultimately scrapped.</li>
<li>The proposal faced severe pushback from civil rights advocates, who maintained that forced decryption violates fundamental constitutional boundaries.</li>
</ul>
<p>South Korea presents the case of a highly digitized society still debating how to handle digital evidence. Rooted in constitutional protections against self-incrimination, the legal system frequently runs into a familiar barrier when suspects withhold their passcodes. This creates a distinct bottleneck for law enforcement, as prosecutors and police often find their inquiries stalled when key figures simply refuse to unlock their smartphones or computers.</p>
<p>To address this investigative standstill, there was a prominent legislative push in 2020 to introduce a compelled decryption law. The proposal sought to penalize non-compliance, effectively treating the refusal to provide a phone password as an independent criminal offense. However, the effort was later scrapped amid intense criticism and human rights concerns. Civil rights groups and legal professionals argued that forcing an individual to disclose a password crosses a fundamental boundary, directly violating the constitutional protection against compelled self-incrimination.</p>
<p>This leaves South Korea in an unsettled position. While authorities actively seek a statutory mechanism to ensure investigative access, the legal basis for compelling an ordinary suspect to unlock a personal device remains contested in light of existing constitutional safeguards.</p>
<h2>Conclusion</h2>
<p>Ultimately, the East Asian landscape provides a clear map of a growing bifurcation in global law regarding compelled decryption. Looking at the region&#8217;s varied approaches to digital forensics and individual rights, two distinct paths emerge. Jurisdictions structured around centralized state control generally give investigative authorities broader access powers and impose stronger duties of technical assistance, especially on regulated service providers. In contrast, jurisdictions that emphasize democratic due process continue to work through the everyday friction between modern investigative demands and the historical preservation of the right against self-incrimination.</p>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/04/law-3-1200x630-1.png" length="835603" type="image/jpeg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/04/law-3-1200x630-1.png" width="1200" height="630" medium="image" type="image/jpeg">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/04/law-3-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>Digital Rights vs. State Power &#8211; The Protectors</title>
		<link>https://blog.elcomsoft.com/2026/04/digital-rights-vs-state-power-the-protectors/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Wed, 01 Apr 2026 14:21:30 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[law]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=13036</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-2-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>The first part of this series examined jurisdictions that have adopted a coercive approach to cryptographic barriers. Nations such as the United Kingdom, Australia, and France navigate the practical hurdles of end-to-end encryption through statutory workarounds. Rather than attempting to break the encryption itself, these legal systems apply pressure directly to the device owner &#8211; [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-2-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>The first part of this series examined jurisdictions that have adopted a coercive approach to cryptographic barriers. Nations such as the United Kingdom, Australia, and France navigate the practical hurdles of end-to-end encryption through statutory workarounds. Rather than attempting to break the encryption itself, these legal systems apply pressure directly to the device owner &#8211; even if the owner is the suspect. By treating the refusal to provide decryption keys or passwords as a standalone criminal offense, they effectively bypass the technical roadblock. Under this model, non-compliance triggers its own set of penalties, entirely separate from the underlying investigation.</p>
<p>This second part turns to a different legal paradigm, examining jurisdictions that prioritize established civil liberties over state access. The focus here rests on legal frameworks maintaining that foundational constitutional protections &#8211; most notably the privilege against self-incrimination &#8211; cannot be set aside simply to accommodate investigative convenience. For these nations, forcing a suspect to produce a password or PIN crosses a critical line, shifting from the lawful seizure of physical evidence to the unlawful compulsion of testimony. The following chapters explore how these countries structure their laws to preserve traditional rights, even as the realities of modern digital forensics present unprecedented challenges for law enforcement.</p>
<h2>The Protectors of Digital Rights</h2>
<p>In contrast to jurisdictions that penalize a suspect&#8217;s refusal to decrypt personal devices, a distinct cohort of nations &#8211; prominently including Canada, Germany, Belgium, and the Netherlands &#8211; anchors its approach in established constitutional and procedural safeguards. Rather than treating investigative access as an absolute imperative, these legal systems maintain that the fundamental privilege against self-incrimination must endure in the digital arena. Within this framework, courts and legislators generally separate the lawful oversight of third-party telecommunications providers from the direct coercion of the individual suspect.</p>
<h2>Canada</h2>
<p>Canada stands against the coercive decryption model. The Canadian judicial system evaluates these modern police practices strictly through the lens of the Canadian Charter of Rights and Freedoms, specifically Section 7, which guarantees the right to life, liberty, and security of the person. This section has been heavily interpreted to encompass the right to silence and robust protection against self-incrimination. Canadian courts have consistently rebuked attempts by the Crown to force suspects to divulge passwords.</p>
<p>In the pivotal case of R. v. Shergill (<a href="https://www.canlii.org/en/on/oncj/doc/2019/2019oncj54/2019oncj54.html">2019 ONCJ 54</a>), the Ontario Court of Justice faced a novel application by the Crown seeking an &#8220;assistance order&#8221; under section 487.02 of the Criminal Code to compel an accused to unlock a seized cellphone. The police openly admitted they lacked the technology to bypass the phone&#8217;s security without destroying the data. Justice Philip Downes explicitly recognized the immense challenges law enforcement faces with modern encryption but ultimately balanced the public interest in prosecuting crimes against the accused&#8217;s liberty interests under the Charter. The court flatly refused to order the accused to unlock the smartphone, reinforcing that the right to silence unequivocally extends into the digital realm.</p>
<p>This protective stance was further reinforced by the Ontario Court of Appeal in R. v. O’Brien (<a href="https://www.canlii.org/en/on/onca/doc/2023/2023onca197/2023onca197.html">2023 ONCA 197</a>). In O’Brien, the court approached police demands for device passwords through the lens of the Charter, emphasizing the inherently coercive circumstances of the search, the lack of truly informed consent, and the failure to provide a meaningful opportunity to exercise the right to counsel before access was obtained. The decision underscores that consent to search a digital device cannot be manufactured through state pressure, confusion, or ignorance of one’s rights.</p>
<h2>Germany</h2>
<p>Germany operates as a prominent stronghold for digital privacy within the European Union, anchoring its approach to compelled decryption in the foundational legal principle of <em>nemo tenetur se ipsum accusare</em> &#8211; the right not to incriminate oneself. Under the German Code of Criminal Procedure (Strafprozessordnung, or StPO), the right to silence is protected in the digital realm. While investigators are fully authorized to seize locked devices and attempt to extract data using their own forensic tools, the law strictly prohibits forcing suspects to surrender their personal passwords, PINs, or cryptographic keys. Compelling an individual to actively unlock their own device is treated as an unacceptable breach of this core procedural safeguard, ensuring that suspects cannot be coerced into handing over the exact tools needed for their own prosecution.</p>
<p>This procedural shield is heavily reinforced by Germany’s highest court. In a landmark 2008 ruling, the Federal Constitutional Court (Bundesverfassungsgericht) established a specialized constitutional protection: the fundamental right to the confidentiality and integrity of information technology systems (often referred to as the &#8220;IT fundamental right&#8221;). Derived from the broader right of personality outlined in the German Basic Law, this ruling set a high judicial bar for state interference with personal computing devices. The court effectively recognized that modern digital devices require explicit and stringent protections against unwarranted state access.</p>
<p>Consequently, the German legal framework places the technical burden of decryption entirely on the state. If law enforcement encounters a heavily encrypted smartphone, investigators must rely on their own technical resources to bypass the security, and cannot jail the device owner for refusing to provide the password.</p>
<h2>Netherlands</h2>
<p>Dutch legislation embeds the privilege against self-incrimination directly into its statutory text regarding digital forensics. Article 125k of the Dutch Code of Criminal Procedure empowers investigators to issue a decryption order to individuals who possess specific technical knowledge of the encrypted system in question. However, the statute explicitly dictates that this legal order absolutely cannot be issued to the suspect. Tellingly, this core protection has been actively preserved even as the country undergoes a sweeping, multi-year Modernization of Criminal Procedure.</p>
<h2>The Biometric Compromise</h2>
<p>The global legal landscape surrounding compelled decryption is vast, and this analysis does not attempt to review every jurisdiction. Many nations would comfortably align with either the coercive or the protective camp. Developing nations, autocracies such as China and Russia, and much of the Asian region &#8211; including Japan and South Korea &#8211; remain outside the scope of this installment. Comprehensively cataloging every national approach in a single text, or even two, is impossible and falls outside the primary goal of examining this core legal divide.</p>
<p>Between the extremes of criminalizing digital silence and absolutely protecting it, a widespread compromise has emerged across numerous jurisdictions: compelled biometric unlock. As device manufacturers transitioned away from typed passwords and seamlessly integrated fingerprint and facial recognition technologies, they unintentionally provided law enforcement with a legal loophole. This rapid technological shift introduced a profound vulnerability into the jurisprudence surrounding self-incrimination.</p>
<p>Constitutional protections against self-incrimination are fundamentally designed to shield the &#8220;contents of the mind,&#8221; preventing the state from forcing an individual to share a memorized code or articulate a thought. Conversely, the law has long maintained that physical traits do not constitute compelled testimony. This strict legal dichotomy is deeply rooted in nineteenth-century forensic science. Following the development of Alphonse Bertillon&#8217;s anthropometry (Bertillonage) and the global rise of dactyloscopy, courts universally accepted that measuring a suspect&#8217;s physical body or taking their fingerprint did not violate the right to silence.</p>
<p>Modern prosecutors have utilized this historical distinction. Because a fingerprint or a face is classified as a physical attribute rather than a memorized secret, courts routinely rule that forcing a suspect to unlock a device via biometrics does not trigger self-incrimination protections. Armed with this reasoning, law enforcement agents regularly secure search warrants authorizing them to physically press a suspect&#8217;s thumb onto a scanner or forcibly hold a smartphone to a suspect&#8217;s face. Under this framework, the human body is legally reduced to a physical key.</p>
<p>This creates a procedural paradox. A suspect who relies on a traditional PIN can legally invoke their right to silence and refuse to surrender the code, while a suspect who uses a biometric shortcut to the very same encrypted data can be physically restrained and forced to unlock the device without any recognized constitutional violation. The legal outcome hinges entirely on the distinction between what the suspect knows and what the suspect is.</p>
<p>While this biometric workaround remains widely exploited by authorities, there are signs of judicial resistance. In January 2025, the D.C. Circuit Court of Appeals issued a potentially landmark ruling in United States v. Brown, concluding that compelling a defendant to unlock a cellphone with a fingerprint actually violated his Fifth Amendment right against self-incrimination. The court reasoned that the compelled biometric unlock was inherently &#8220;testimonial&#8221; because it explicitly communicated the suspect&#8217;s control and ownership over the device and its hidden contents.</p>
<h2>Conclusion</h2>
<p>The global legal landscape is currently defined by a collision between modern cryptography and the coercive power of the state. In an effort to bypass encryption, several jurisdictions have granted police the leverage to force suspects to unlock their devices under the threat of severe penalties. However, this approach effectively hollows out the ancient legal principle of <em>nemo tenetur se ipsum accusare</em> &#8211; the right against self-incrimination.</p>
<p>This increasing reliance on statutory coercion, coupled with the exploitation of outdated biometric loopholes, signals a significant shift in the balance of legal power. If these practices continue unabated, the fundamental protection against forced self-incrimination risks surviving only as a theoretical construct on paper, largely disconnected from the digital reality where everyday human life now occurs.</p>
<h2>References</h2>
<p><strong>Canada</strong></p>
<ul>
<li><a href="https://cantechlaw.ca/en/news/12846146">Forced Unlocking Amounts to Self-Incrimination &#8211; Canadian Technology Law Association</a></li>
<li><a href="https://blog.privacylawyer.ca/2019/04/ontario-court-refuses-to-order-accused.html">Ontario court refuses to order accused to unlock his smartphone &#8211; Privacy Lawyer</a></li>
<li><a href="https://www.mccarthy.ca/en/insights/blogs/canadian-appeals-monitor/the-right-to-silence-carries-the-right-to-keep-passwords-secret">The Right to Silence Carries the Right to Keep Passwords Secret &#8211; McCarthy Tétrault LLP</a></li>
</ul>
<p><strong>Netherlands</strong></p>
<ul>
<li><a href="https://iclg.com/practice-areas/cybersecurity-laws-and-regulations/netherlands">Cybersecurity Laws and Regulations Report 2026 Netherlands &#8211; ICLG.com</a></li>
</ul>
<p><strong>Germany</strong></p>
<ul>
<li><a href="https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/EN/2008/02/rs20080227_1bvr037007en.html">Bundesverfassungsgericht &#8211; Decisions search &#8211; Judgment of 27 February 2008</a></li>
<li><a href="https://de.wikipedia.org/wiki/Grundrecht_auf_Gew%C3%A4hrleistung_der_Vertraulichkeit_und_Integrit%C3%A4t_informationstechnischer_Systeme">Grundrecht auf Gewährleistung der Vertraulichkeit und Integrität informationstechnischer Systeme – Wikipedia</a></li>
</ul>
<p><strong>Biometric Unlock (Various Jurisdictions)</strong></p>
<ul>
<li><a href="https://www.lawfaremedia.org/article/fifth-amendment-decryption-and-biometric-passcodes">The Fifth Amendment, Decryption and Biometric Passcodes &#8211; Lawfare</a></li>
<li><a href="https://www.newamerica.org/insights/biometrics-vs-fifth-amendment/">Biometrics vs. the Fifth Amendment &#8211; New America</a></li>
<li><a href="https://fedsoc.org/commentary/fedsoc-blog/do-compelled-biometrics-violate-the-fifth-amendment-a-deepening-split-among-lower-courts">Do Compelled Biometrics Violate the Fifth Amendment? A Deepening Split Among Lower Courts &#8211; The Federalist Society</a></li>
<li><a href="https://www.purduegloballawschool.edu/blog/constitutional-law/fifth-amendment-biometrics">Does the Fifth Amendment Protect Biometrics? &#8211; Purdue Global Law School</a></li>
<li><a href="https://www.arnoldporter.com/en/perspectives/advisories/2025/03/when-your-fingers-do-the-talking">When Your Fingers Do the Talking: D.C. Circuit Rules That Compelled Opening of Cellphone With Fingerprint Violates the Fifth Amendment &#8211; Arnold &amp; Porter</a></li>
</ul>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-2-1200x630-1.png" length="968006" type="image/jpeg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-2-1200x630-1.png" width="1200" height="630" medium="image" type="image/jpeg">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-2-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>The Geography of Coercion: a Study of Compelled Decryption Laws</title>
		<link>https://blog.elcomsoft.com/2026/03/the-geography-of-coercion-a-study-of-compelled-decryption-laws/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Tue, 31 Mar 2026 06:16:25 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[law]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=13030</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-1-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>On March 23, 2026, the Hong Kong government amended the rules of its National Security Law, making it a criminal offense to refuse to provide police with passwords or decryption assistance for personal devices. When I read the security alert, my initial plan was simply to compile a list of jurisdictions with similar laws. That catalog quickly outgrew [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-1-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>On March 23, 2026, the Hong Kong government amended the rules of its National Security Law, making it a criminal offense to refuse to provide police with passwords or decryption assistance for personal devices. When I read the security alert, my initial plan was simply to compile a list of jurisdictions with similar laws. That catalog quickly outgrew its premise. Tracking these statutes revealed a fractured global approach to digital privacy and state power, resulting in a comparative study too broad for a single article. I decided to split the research into two parts. This first installment examines the countries that criminalize digital silence.</p>
<p>The catalyst for these laws is strictly technological. Modern electronic devices hold an unprecedented concentration of our private lives. At the same time, the widespread adoption of strong encryption has built mathematical walls that investigative agencies cannot breach. Faced with this barrier, the state&#8217;s focus has shifted. Unable to break the device, governments increasingly use the threat of separate jail time to compel the user to open it.</p>
<p>This dynamic forces a re-examination of the right against self-incrimination. The core debate asks whether forcing a suspect to hand over a password is akin to surrendering a physical key, like a safe combination, or if it constitutes forcing them to testify against themselves. This article maps out the jurisdictions that bypass that debate entirely, choosing instead to make the refusal to decrypt an independent crime.</p>
<h2>The Offenders</h2>
<p>As I mapped out how different legal systems handle the encryption roadblock, a distinct, highly pragmatic pattern emerged. The philosophical debate over whether a password is a physical key or a piece of testimony is legally rich, but it is also incredibly slow. For an investigator staring at a locked device that might hold the centerpiece of a case, abstract constitutional theory offers little immediate use.</p>
<p>A growing number of jurisdictions have decided they simply do not have the patience for that debate. Rather than untangling the historical nuances of self-incrimination, they bypassed the problem entirely by creating a brand new crime.</p>
<p>In these countries, digital silence is a standalone offense. The logic is a fascinating piece of legislative circumvention: the state is not punishing you for whatever illicit material might be hidden on your hard drive because they cannot see it. Instead, you are prosecuted strictly for the act of keeping the door locked. By severing the refusal to provide a password from the underlying criminal investigation, these governments have weaponized the penal code against the locked screen itself.</p>
<p>It is a blunt, highly effective workaround to the mathematics of encryption. Here is how that hardline approach plays out on the ground.</p>
<h3>The United Kingdom</h3>
<p>When tracing the origins of the coercive model, the path leads directly to the United Kingdom. The British approach serves as a primary blueprint for how a state can pivot its penal code against a locked screen.</p>
<p>The mechanism driving this is the Regulation of Investigatory Powers Act 2000, commonly known as RIPA. Under Section 49 of the act, police and other authorities can serve a formal notice demanding a suspect hand over their password, PIN, or decryption key. To issue this demand, authorities need a reasonable belief that the person knows the code and that access is necessary to prevent crime, protect national security, or safeguard the country&#8217;s economic well-being; the notice also has to be necessary, proportionate, and used where it is not reasonably practicable to obtain the intelligible information another way.</p>
<p>The real teeth of the legislation reside in Section 53. Failing to comply with a Section 49 notice without a lawful excuse is classified as a standalone criminal offense. The penalties are explicitly designed to act as a coercive lever: a standard refusal carries a maximum two-year prison sentence, but if the underlying investigation involves national security or indecent images of children, that penalty jumps to five years.</p>
<p>Watching this dynamic operate in practice, the legal paradox it creates inside an interrogation room is striking. Suspects are forced into a high-stakes calculation, having to immediately weigh whether the data on their device is so incriminating that taking a guaranteed prison sentence for silence is a better bet than facing the primary charges.</p>
<p>It is easy to mischaracterize how this looks on the ground. When I first started assembling notes for this section, I looked at the case of <a href="https://www.bbc.com/news/uk-england-hampshire-48994913">Stephen Nicholson</a>, who refused to hand over his Facebook password during the 2018 Lucy McHugh murder investigation. It is tempting to frame his resulting prison sentence as an example of a man locked up &#8220;strictly for withholding a password&#8221; &#8211; but looking at the actual timeline breaks that narrative.</p>
<p>Nicholson was the prime suspect in a murder case. When he refused to yield his password, the state didn&#8217;t wait for the homicide investigation to conclude. They hit him with a RIPA charge immediately, resulting in a 14-month prison sentence in August 2018. This allowed authorities to effectively incarcerate an uncooperative murder suspect while they built the primary case against him. By July 2019, Nicholson was convicted of the rape and murder. In this context, RIPA functioned less as a standalone punishment and more as an immediate tactical holding maneuver for a much darker crime.</p>
<p>A clearer example of how the law operates entirely independently of underlying guilt is the case of Tajan Spalding. Spalding was handed an eight-month prison sentence for refusing to provide the passcodes to his iPhone and iPad during a drug investigation.</p>
<h3>France: Staying Silent as Active Obstruction of Justice</h3>
<p>France, a country whose national motto is Liberté, égalité, fraternité, has taken a surprisingly hardline stance on digital privacy under Article 434-15-2 of its Penal Code. This law makes it a severe criminal offense to refuse to hand over a decryption key or password to law enforcement during an investigation. For a while, there was legal back-and-forth about whether a standard smartphone PIN actually counted as a decryption key or just a simple lock. However, in late 2022, France&#8217;s highest court, the Court of Cassation, definitively settled the debate: a phone PIN is legally recognized as a decryption tool <em>if</em> the phone is encrypted. Refusing to provide it to the police can land you with a prison sentence of up to three years and a staggering €270,000 fine.</p>
<p>The key nuance here is that this obligation falls on absolutely anyone who knows the code, up to and including the suspect themselves. This raises obvious and tricky questions about the right against self-incrimination, a cornerstone of many legal systems. The French courts, however, have effectively bypassed this protection by framing the refusal not as an act of staying silent, but as an active obstruction of justice. They argue the code isn&#8217;t inherently incriminating evidence; it&#8217;s just the key to access a space where evidence might be.</p>
<p>For the average private citizen, this means the old idea that &#8220;my phone is my private digital vault&#8221; doesn&#8217;t hold much water if you&#8217;re ever taken into custody. You can be hit with heavy criminal charges purely for keeping your passcode to yourself, completely separate from whatever crime the police were originally investigating.</p>
<p>To be fair, the Court of Cassation did introduce a technical caveat in its <a href="https://www.courdecassation.fr/decision/6368dc51f1ea8a7f744fbf98">November 2022 ruling</a>, clarifying that a phone PIN isn&#8217;t automatically considered a &#8220;decryption key&#8221; by default. The judges specified that for the offense to trigger, the state must prove the specific device is actively equipped with encryption software and that the PIN actually unscrambles the data, rather than just acting as a simple home screen lock. While this sounds like a meaningful privacy safeguard on paper, it is a slightly cynical moot point in the real world. Virtually all modern smartphones come with full-device encryption enabled out of the box. Ironically, if a suspect happened to possess an older, unencrypted phone where the passcode merely locked the screen, law enforcement wouldn&#8217;t need to compel them to reveal the code anyway &#8211; they could easily bypass the lock and pull the data using standard forensic extraction tools. So, while the court drew a careful legal distinction, the everyday reality of consumer tech ensures the threat of prosecution remains practically universal.</p>
<h3>Belgium</h3>
<p>The Belgian legislative model, primarily governed by Article 88quater of the Code of Criminal Procedure, offers a deeply nuanced approach that actively splits the difference between physical action and abstract knowledge. The law draws a sharp distinction regarding what can actually be compelled. Under §2 of the article, an investigating judge can order a suitable person to operate a computer system or carry out technical actions; crucially, that order cannot be directed at the suspect, thereby protecting the accused from being forced to actively participate in the search of their own device. However, §1 creates a significant caveat by allowing authorities to require access-enabling information from a person believed to possess the necessary knowledge, and Belgian case law has confirmed that this can include the suspect.</p>
<p>Following rulings by the Belgian Court of Cassation and the Constitutional Court in 2020, the practical position of Belgian law is that, while the state cannot force a suspect to personally unlock or operate a seized phone under §2, it can punish refusal to disclose access codes under §1 and §3, on the reasoning that such codes constitute pre-existing information rather than compelled participation in the collection of evidence.</p>
<p>In essence, Belgium draws a procedural distinction between forcing the suspect to operate the device and forcing the suspect to reveal the secret needed to unlock it. But for the purposes of <em>nemo tenetur se ipsum accusare</em>, that distinction does not alter the underlying reality: the state may still compel disclosure of “something that the suspect knows” under threat of punishment. Belgium therefore belongs in the coercive camp.</p>
<h3>Australia</h3>
<p>Looking further afield, the Australian framework caught my attention for its sheer punitive weight. The foundation of this approach was laid when the Cybercrime Act 2001 inserted Section 3LA into the federal Crimes Act. While much of the modern public debate around encryption focuses on forcing major tech companies or telecommunication providers to build systemic backdoors, Section 3LA zeroes in squarely on the individual. It is a legal lever designed exclusively to compel physical persons &#8211; suspects and witnesses alike &#8211; to unlock their personal devices. Corporate responsibility is addressed elsewhere: Australia aggressively targets tech companies through parallel legislation such as the 2018 TOLA Act.</p>
<p>Under this provision, if law enforcement believes a device holds evidence of a crime, they can obtain a magistrate&#8217;s order forcing the specified person with knowledge to provide the necessary passwords, PINs, or biometric access. The mechanism is simple: you are handed a warrant, and your failure to immediately decrypt the phone or laptop becomes a standalone federal offense. There is no drawn-out constitutional debate inside the interrogation room about the historical right to silence. The state simply demands the keys, and the individual must make an immediate choice.</p>
<p>What makes the Australian model staggering is the consequence of saying no. In investigations involving serious federal offenses, refusing a Section 3LA order carries a maximum penalty of ten years in prison. This is not a subtle legal nudge; it operates as an overwhelming coercive force. By threatening a decade behind bars strictly for keeping a screen locked, the law effectively neutralizes any practical reliance on the privilege against self-incrimination. It creates an environment where the individual is strong-armed into doing the investigative heavy lifting for the state, rendering their digital silence a highly penalized luxury.</p>
<h3>New Zealand</h3>
<p>As a direct neighbor to this Australian framework, New Zealand offers a compelling companion case study in how state coercion is applied directly to the individual. Under Section 130 of the Search and Surveillance Act 2012, New Zealand law enforcement officers executing a search warrant possess the explicit authority to compel a physical person to provide the PIN, password, or encryption key to a lawfully seized device. Refusal to comply is not met with a procedural negotiation; it is prosecuted as a standalone offense punishable by up to three months&#8217; imprisonment. Like the Australian model, this legislation zeroes in entirely on the personal responsibility of the suspect or witness holding the device, completely bypassing any reliance on the cooperation of external tech corporations.</p>
<p>It is important to note that this domestic investigative power operates parallel to an entirely separate border regime. When I initially looked into New Zealand’s approach, I had to draw a line between standard police powers and customs enforcement. Under the Customs and Excise Act 2018, border agents can demand travelers unlock their devices or hand over passwords under the much lower threshold of having &#8220;reasonable cause to suspect&#8221; wrongdoing. Refusal at the border triggers a distinct $5,000 NZD fine and potential prosecution. While border searches rely on a different legal foundation &#8211; rooted in a sovereign state&#8217;s inherent right to control its ports of entry rather than domestic criminal warrants &#8211; the underlying legislative mechanics remain similar.</p>
<h2>Expanding the Map</h2>
<p>Moving further East, the legal landscape surrounding compelled decryption fractures significantly, ranging from rigid statutory demands to complex constitutional debates.</p>
<p>In <strong>Singapore</strong>, authorities rely heavily on Section 39 of the Criminal Procedure Code (CPC). Following a suite of criminal justice reforms in 2018, Singaporean investigators were granted explicit, broad powers to order any individual to provide their login credentials or assist in decrypting a device. Crucially, this applies to anyone the police reasonably believe has knowledge of the password &#8211; whether they are the primary suspect, a family member, or a bystander. Refusing to hand over a passcode or help bypass security measures during a criminal investigation is treated as a direct obstruction of justice. According to a Singapore Police press release, the penalty in practice is a fine of up to S$5,000, imprisonment for up to six months, or both.</p>
<p>In <strong>Hong Kong</strong>, the legal boundary for forced decryption is explicitly tied to &#8220;National Security,&#8221; though the practical application of that term is notoriously broad. Under the March 2026 amendments to the Implementation Rules of the National Security Law (NSL), police officers can demand device passwords, decryption keys, or technical assistance. On March 27, the Hong Kong government publicly clarified that police may require a password only after legal authorization to search the device has been obtained, and that officers have no power to randomly demand passwords from ordinary people on the street. While this rule is technically confined to national security investigations, it applies to everyone physically present in the jurisdiction, both residents and visitors. Refusing to unlock a phone or laptop is a standalone criminal offense punishable by up to a year in prison and a hefty fine.</p>
<h2>Contested Jurisdictions</h2>
<p>While some nations have deployed blunt statutes to mandate decryption, other major democracies find themselves caught in a constitutional tug-of-war. In contested jurisdictions like the United States and India, law enforcement&#8217;s push for digital access constantly collides with deeply entrenched protections against self-incrimination. Rather than relying on a single, clear-cut law, these countries navigate an evolving patchwork of local court rulings, procedural workarounds, and ongoing legal debates. For the average person, this turns digital privacy into a highly unpredictable gray area, where the right to keep a passcode secret often depends less on unified legislation and more on how a specific judge decides to apply older legal principles to the smartphone era.</p>
<h3>United States</h3>
<p>Unlike the statutory coercion found in the UK and Australia where refusing to decrypt a device is a standalone criminal offense, the United States relies on procedural workarounds to force compliance. The primary hurdle for US law enforcement is the Fifth Amendment&#8217;s protection against self-incrimination, which generally covers the &#8220;testimonial&#8221; act of retrieving a password from your memory. To bypass this, prosecutors frequently invoke the &#8220;Foregone Conclusion&#8221; doctrine. If the government can prove they already know a suspect owns the device, knows the passcode, and can generally identify the files inside, courts in some jurisdictions have held that unlocking the device adds nothing to the government&#8217;s knowledge. This effectively strips away the Fifth Amendment shield. If a private citizen still refuses to comply with a judge&#8217;s decryption order, they aren&#8217;t charged with a new data-related crime; instead, they are held in civil contempt of court until they yield.</p>
<p>This procedural tactic has created a significant legal loophole, illustrated by the harrowing case of Francis Rawls. Suspected of possessing illicit digital material, Rawls was ordered to decrypt his hard drives. When he claimed he could not remember the passwords, the judge held him in civil contempt. The federal Recalcitrant Witness Statute is explicitly designed to cap this kind of coercive confinement at 18 months. However, prosecutors successfully argued for years that Rawls was a suspect rather than a mere witness, keeping the statutory limit at bay. As a result, Rawls was held in a federal detention center for over four years without ever facing a trial, a jury, or formal criminal charges. He was ultimately released in 2020 only after the Third Circuit Court of Appeals ruled the 18-month cap applied to him, highlighting how civil contempt can morph into an indefinite, uncharged prison sentence.</p>
<p>Rawls&#8217;s ordeal contrasts sharply with how similar cases are handled elsewhere in the country, showcasing the precarious geographic lottery of US digital rights. In the 2012 case In re Grand Jury Subpoena, the 11th Circuit Court of Appeals took a much stricter stance on the Foregone Conclusion doctrine. They ruled that because the government couldn&#8217;t identify with &#8220;reasonable particularity&#8221; what specific files were hidden on the encrypted drives, forcing the suspect to decrypt them fundamentally violated the Fifth Amendment. Because the US Supreme Court has yet to issue a definitive, nationwide ruling on compelled decryption, a citizen&#8217;s fundamental right to digital privacy &#8211; and their risk of being jailed for contempt &#8211; currently depends entirely on which federal circuit they happen to reside in.</p>
<h3>India</h3>
<p>In <strong>India</strong>, the legal landscape is currently caught in a tense tug-of-war between statutory power and constitutional rights, a situation best described as &#8220;unsettled&#8221;. On one side is Section 69 of the Information Technology Act, 2000, which broadly compels individuals to provide technical assistance to decrypt data or face up to seven years in prison. On the other side is the Indian Constitution &#8211; specifically Article 20(3), which protects against self-incrimination, alongside the landmark Puttaswamy judgment that enshrined privacy as a fundamental right. This clash has sparked ongoing legal debates over whether forcing a suspect to hand over a memorized passcode constitutes an unconstitutional extraction of protected knowledge from their mind. The Karnataka and Kerala High Courts have taken positions allowing compelled disclosure, while the Delhi High Court held that an accused could not be coerced to reveal passwords during an ongoing trial. Reporting from March 2026 also indicates the Kerala High Court has stayed a compelled-passcode order pending review.</p>
<h2>Conclusion</h2>
<p>Ultimately, the global patchwork of compelled decryption laws points to an ongoing shift in how justice systems navigate the digital age. Whether relying on explicit statutory offenses or procedural workarounds like civil contempt, authorities are steadily bypassing the <em>nemo tenetur</em> principle &#8211; the long-established safeguard against self-incrimination. From an analytical standpoint, this trend isn&#8217;t a simple matter of right versus wrong. Law enforcement naturally seeks the most efficient path to secure evidence, and encrypted devices present a genuine hurdle. However, redefining a memorized passcode as a mere physical key rather than protected personal knowledge structurally alters the legal landscape.</p>
<p>The core issue with eroding this principle is the resulting imbalance of power. The state already holds a natural monopoly on legal force and authority. Constitutional rights and historical legal doctrines exist specifically to counterbalance that weight, ensuring private citizens have a baseline defense against the machinery of prosecution. When courts and legislatures steadily chip away at the right to remain digitally silent, they tilt that scale heavily in the state&#8217;s favor. It establishes a dynamic where traditional boundaries on government power are quietly downgraded for the sake of technological convenience, significantly reshaping the equilibrium between the individual and the state.</p>
<h2>Sources</h2>
<ul>
<li><a href="https://www.pastpaperhero.com/resources/uk-legal-s49-ripa-notice" target="_blank" rel="noopener">Section 49 RIPA Notices: Duties, Defences and Risks &#8211; PastPaperHero</a></li>
<li><a href="https://www.saunders.co.uk/news/prosecuted-for-your-password/" target="_blank" rel="noopener">Prosecuted for your password &#8211; Saunders Law</a></li>
<li><a href="https://ngm.com.au/3la-orders-crimes-act-1914/" target="_blank" rel="noopener">3LA orders Crimes Act 1914 &#8211; NGM Lawyers</a></li>
<li><a href="https://iapp.org/news/a/australias-first-in-the-world-decryption-laws-will-impact-tech-providers-globally" target="_blank" rel="noopener">Australia&#8217;s first-in-the-world &#8216;decryption&#8217; laws will impact tech providers globally | IAPP</a></li>
<li><a href="https://pcs-avocat.com/en/article-434-15-2-of-the-penal-code-refusal-to-unlock-ones-phone-a-new-criminal-offense/" target="_blank" rel="noopener">Article 434-15-2 of the Penal Code: Refusal to unlock one&#8217;s phone, a new criminal offense &#8211; PCS Avocat</a></li>
<li><a href="https://www.fairtrials.org/articles/news/french-court-rules-that-refusing-to-disclose-a-mobile-passcode-to-law-enforcement-is-a-criminal-offence/" target="_blank" rel="noopener">French Court rules that refusing to disclose a mobile passcode to law enforcement is a criminal offence &#8211; Fair Trials</a></li>
<li><a href="https://www.channelnewsasia.com/east-asia/new-hong-kong-rule-give-passwords-security-cases-6011231" target="_blank" rel="noopener">New Hong Kong rules force people to give up passwords in security cases &#8211; CNA</a></li>
<li><a href="https://www.the-independent.com/asia/china/hong-kong-police-passwords-phones-national-secuirty-law-b2944261.html" target="_blank" rel="noopener">Hong Kong police can now demand phone and computer passwords under new national security rule &#8211; The Independent</a></li>
<li><a href="https://sso.agc.gov.sg/Act/CPC2010" target="_blank" rel="noopener">Criminal Procedure Code 2010 &#8211; Singapore Statutes Online</a></li>
<li><a href="https://en.wikipedia.org/wiki/Information_Technology_Act,_2000" target="_blank" rel="noopener">Information Technology Act, 2000 &#8211; Wikipedia</a></li>
<li><a href="https://bharatchugh.in/2023/10/07/whether-an-accused-can-be-directed-to-disclose-his-phone-password-by-the-investigators/" target="_blank" rel="noopener">Whether an accused can be directed to disclose his phone password by the investigators? &#8211; Bharat Chugh</a></li>
<li><a href="https://en.wikipedia.org/wiki/United_States_v._Rawls" target="_blank" rel="noopener">United States v. Rawls &#8211; Wikipedia</a></li>
<li><a href="https://www.sophos.com/en-us/blog/suspect-who-refused-to-decrypt-hard-drives-released-after-four-years" target="_blank" rel="noopener">Suspect who refused to decrypt hard drives released after four years | SOPHOS</a></li>
<li><a href="https://law.justia.com/cases/federal/appellate-courts/ca11/11-12268/11-12268-2012-02-23.html" target="_blank" rel="noopener">In Re: Grand Jury Subpoena Duces Tecum Dated March 25, 2011, USA v. John Doe, No. 11-12268 (11th Cir. 2012) &#8211; Justia Law</a></li>
<li><a href="https://www.mbtmag.com/cybersecurity/news/13246557/new-zealand-fines-travelers-who-wont-unlock-secure-devices" target="_blank" rel="noopener">New Zealand Fines Travelers Who Won&#8217;t Unlock Secure Devices &#8211; MBT Mag</a></li>
</ul>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-1-1200x630-1.png" length="825320" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-1-1200x630-1.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-1-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>Arrested by AI</title>
		<link>https://blog.elcomsoft.com/2026/03/arrested-by-an-algorithm/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Fri, 27 Mar 2026 12:32:10 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[law]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=13018</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-4-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>In July 2025, a tactical team of United States Marshals descended on the Tennessee home of Angela Lipps, arresting the fifty-year-old grandmother at gunpoint while she watched her young grandchildren. Her apprehension was not the culmination of traditional detective work, but the result of authorities placing undue confidence in an AI-based facial recognition system. An [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-4-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>In July 2025, a tactical team of United States Marshals descended on the Tennessee home of Angela Lipps, arresting the fifty-year-old grandmother at gunpoint while she watched her young grandchildren. Her apprehension was not the culmination of traditional detective work, but the result of authorities placing undue confidence in an AI-based facial recognition system. An algorithm had linked a photograph of her face to a counterfeit military identification card used in a sophisticated bank fraud operation over 1,200 miles away in Fargo, North Dakota.</p>
<p>At its core, facial recognition software produces mathematical probabilities, not definitive facts. The technology is designed to offer an investigative lead, closer to an unverified tip phoned in by an anonymous informant than to actual evidence. Yet, in the Lipps investigation, that critical distinction was ignored. Investigators treated the algorithm&#8217;s probabilistic suggestion as actionable evidence, opting to secure a felony arrest warrant rather than conduct basic due diligence. Lipps would spend roughly six months incarcerated as a fugitive from justice before the state&#8217;s case ultimately fell apart. The reality of this breakdown leaves us with a fundamental, unsettling question: how could such a severe, life-altering error have been avoided by something as ordinary as checking a suspect&#8217;s alibi?</p>
<h2>The Cost of Error</h2>
<p>When an automated error crosses from a database into the physical world, the resulting damage is no longer just statistical. For Angela Lipps, the failure to verify an automated lead triggered a cascade of real-world consequences. The incident began with the immediate shock of being taken into custody at gunpoint in the presence of her grandchildren, but the collateral damage extended far beyond the initial arrest. Over the course of nearly half a year spent in jail, the infrastructure of her daily life was effectively dismantled. While she was incarcerated and navigating emerging health issues behind bars, she lost her rental home, her health insurance, and her pet. The public nature of the felony fraud allegations also damaged her reputation within her community. When the charges were finally dropped, she was released in North Dakota on Christmas Eve with no coat and no way home.</p>
<p>The official response only deepened the sense of institutional failure. Outgoing Fargo Police Chief David Zibolski refused to issue a formal, direct apology to Lipps, telling the press that the investigation was ongoing and that it was &#8220;too early&#8221; to completely rule out her involvement. <a href="https://www.mprnews.org/story/2026/03/24/fargo-police-chief-apologizes-for-mistakes-in-aiaided-arrest">An official apology</a> did not come until late March 2026, roughly three months after her release in December 2025.</p>
<h2>The Broader Pattern</h2>
<p>The incident in North Dakota is not an unprecedented anomaly. Instead, it fits into an established pattern of wrongful detentions driven by algorithmic identification. In Detroit, police wrongfully arrested Robert Williams after facial recognition software incorrectly flagged his driver&#8217;s license, an error that eventually cost the city $300,000 in compensation. A similar misstep in New Jersey put Nijeer Parks behind bars following a digital misidentification, which ultimately led to a $150,000 settlement.</p>
<p>These cases point to a much broader issue in modern law enforcement. The core problem is not just that the software occasionally gets it wrong. The real danger lies in human institutions relying on these tools too eagerly. When investigators take a computer&#8217;s probabilistic guess as a fact, they strip away the safeguards of traditional police work. The lesson is straightforward: technology might generate the lead, but it is the human decision to skip basic verification that actually puts innocent people in handcuffs.</p>
<h2>The Question of Liability</h2>
<p>When assessing the legal fallout, the question of liability inevitably points toward human decision-makers rather than the software itself. The primary targets for civil litigation would likely be the City of Fargo and the specific detectives who handled the investigation. An algorithm cannot swear out a warrant or dispatch a tactical team; it merely generates a statistical match. It was human officials who made the active choice to rely on that output while skipping the fundamental step of independently verifying the suspect&#8217;s whereabouts.</p>
<p>Looking ahead, it seems unlikely that this dispute will ever reach a courtroom. Municipalities facing this level of exposure typically push for a quiet, substantial settlement rather than risk the public embarrassment and exhaustive scrutiny of their investigative practices that a trial would bring. Considering that earlier misidentification cases involving only brief detentions yielded payouts of up to $300,000, the financial stakes here are far higher. With Lipps enduring nearly six months of wrongful imprisonment, the duration of her detention alone suggests that any eventual settlement could exceed those seen in earlier algorithmic misidentification cases.</p>
<h2>Technology Can Suggest, Humans Must Decide</h2>
<p>The ultimate takeaway from the Fargo investigation is not that artificial intelligence has no place in modern law enforcement. When utilized correctly, algorithmic tools can be effective at processing vast datasets to identify patterns and generate preliminary leads. The critical failure occurs when a tool&#8217;s output is elevated from a statistical suggestion to an undeniable conclusion.</p>
<p>Technology can calculate probabilities, but it cannot &#8211; and should not &#8211; deliver justice. That burden rests firmly on the investigators who wield it. Outsourcing critical thinking to a machine bypasses the fundamental safeguards designed to protect individuals from unwarranted state action. Artificial intelligence can certainly point investigators in a specific direction, but the foundational duty to verify facts, test alibis, and protect innocent people remains an entirely human responsibility.</p>
<h2>References</h2>
<ol>
<li><a href="https://bringmethenews.com/minnesota-news/fargo-pd-under-scrutiny-after-woman-flagged-by-ai-facial-recognition-is-jailed-for-months">Fargo PD under scrutiny after woman flagged by AI facial recognition is jailed for months &#8211; Bring Me The News</a></li>
<li><a href="https://www.theguardian.com/us-news/2026/mar/12/tennessee-grandmother-ai-fraud">Tennessee grandmother jailed after AI facial recognition error links her to fraud | Tennessee | The Guardian</a></li>
<li><a href="https://www.ndtv.com/world-news/us-woman-wrongly-imprisoned-for-6-months-due-to-faulty-facial-recognition-11209378">US Woman Wrongly Imprisoned For 6 Months Due To Faulty Facial Recognition</a></li>
<li><a href="https://www.valleynewslive.com/2026/03/26/sheriff-says-email-shows-fargo-police-knew-angela-lipps-arrest-months-earlier/">Sheriff says email shows Fargo Police knew of Angela Lipps arrest months earlier</a></li>
<li><a href="https://www.mprnews.org/story/2026/03/24/fargo-police-chief-apologizes-for-mistakes-in-aiaided-arrest">Fargo police chief apologizes for mistakes in AI-aided arrest | MPR News</a></li>
</ol>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-4-1200x630-1.png" length="1274783" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-4-1200x630-1.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/law-4-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>Distributed Password Recovery Goes 64-bit: Ready for RTX 5090</title>
		<link>https://blog.elcomsoft.com/2026/03/distributed-password-recovery-goes-64-bit-ready-for-rtx-5090/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Thu, 26 Mar 2026 09:00:59 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[64-bit]]></category>
		<category><![CDATA[EDPR]]></category>
		<category><![CDATA[Nvidia]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=12682</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/01/EDPR64-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>We have just released a major update to Elcomsoft Distributed Password Recovery. While the release notes might simply say &#8220;migrated to 64-bit,&#8221; the reality under the hood is far more complex and significant. This is not a cosmetic update or a simple recompile; it is a fundamental architectural shift necessitated by the evolution of GPU [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/01/EDPR64-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>We have just released a major update to Elcomsoft Distributed Password Recovery. While the release notes might simply say &#8220;migrated to 64-bit,&#8221; the reality under the hood is far more complex and significant. This is not a cosmetic update or a simple recompile; it is a fundamental architectural shift necessitated by the evolution of GPU hardware. Put simply: if you want to use the latest NVIDIA RTX 50-series Blackwell GPUs for password recovery, you can no longer use 32-bit code.</p>
<p>Here is why we did it, why it took so long, and why it matters for your forensic lab.</p>
<h2>The introduction and evolution of hardware acceleration</h2>
<p>For years, the highest performance in password recovery could be achieved by utilizing SIMD (Single Instruction Multiple Data) hardware &#8211; ranging from professional accelerators to consumer video cards powered by AMD and NVIDIA GPUs. In early 2007, we developed a method to accelerate password recovery using GPU hardware, a technology that would forever change password recovery. At a time when GPUs were thought of purely as graphics engines, Elcomsoft engineers found a way to pair them with CPUs to dramatically accelerate cryptographic calculations. This discovery came just as NVIDIA officially released CUDA &#8211; a parallel computing platform that allows software to offload computations onto GPU hardware. It was the first toolkit to open GPUs to general-purpose computing. Together these developments opened the door to much faster attacks on consumer hardware.</p>
<p>Since then, we&#8217;ve been using CUDA exclusively to work with NVIDIA GPUs. Back then, the entire code base was 32-bit; 64-bit compute on consumer hardware was largely unheard of. Eventually, 64-bit architecture started gaining traction in the consumer space, but we continued developing Distributed Password Recovery in 32-bit. Why? Because it just worked, and we didn&#8217;t see a real benefit in switching to a different code base. Spoiler: we still don&#8217;t. 64-bit instructions don&#8217;t magically accelerate workloads that don&#8217;t need large address spaces &#8211; and password recovery gains nothing from 64-bit instructions directly. Yet, the indirect benefit is clear.</p>
<p>NVIDIA has been signaling the exit from 32-bit compute for a long time. They deprecated 32-bit x86 CUDA support back in 2018 with CUDA 10, and removed it in CUDA 11. However, the real &#8220;hard stop&#8221; has arrived with the Blackwell architecture (RTX 50 series), which requires CUDA 12.8 or newer. These new GPUs and the accompanying CUDA 12.8 toolkit have dropped support for 32-bit compute applications entirely. You cannot run a 32-bit CUDA kernel on a Blackwell GPU; the driver simply won&#8217;t allow it.</p>
<p>For a long time, we maintained EDPR as a 32-bit application because it &#8220;just worked&#8221; on the hardware available at the time (Ampere, Ada Lovelace). But with the arrival of Blackwell, we faced a binary choice: stay 32-bit and lose support for all future NVIDIA hardware, or rewrite the entire engine for 64-bit. We chose the latter.</p>
<h2>Under the hood: the EDPR architecture</h2>
<p>To understand the scale of this migration, you have to look at how Elcomsoft Distributed Password Recovery is built. The system consists of three distinct components:</p>
<ul>
<li><strong>The Server</strong>: This acts as the command center. It manages the queue, breaks password recovery jobs into manageable chunks, and distributes them to available clients.</li>
<li><strong>The Console</strong>: The GUI where the investigator sets up attacks, configures masks, and monitors progress.</li>
<li><strong>The Agents</strong>: These are the workers. They run on the network workstations (potentially including the local machine), receive the chunks from the server, perform the actual brute-force or dictionary attacks, and report the results back.</li>
</ul>
<p>The Agent is where the heavy lifting happens, and crucially, where the GPU acceleration lives. This is where we hit the complexity.</p>
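<p>EDPR&#8217;s internal chunking protocol is not public, but the division of labor described above can be illustrated with a loose sketch. The Python fragment below is entirely hypothetical &#8211; the function name and chunk size are ours, not EDPR&#8217;s &#8211; and shows how a coordinator might split a fixed-length keyspace into chunks for distribution to workers:</p>

```python
from itertools import islice, product
import string

def chunk_keyspace(charset: str, length: int, chunk_size: int):
    """Yield (start_index, candidates) chunks covering every fixed-length
    password over the given character set."""
    total = len(charset) ** length
    # Candidates are generated lazily, so the coordinator never has to
    # materialize the full keyspace in memory.
    candidates = ("".join(p) for p in product(charset, repeat=length))
    index = 0
    while index < total:
        batch = list(islice(candidates, chunk_size))
        yield index, batch
        index += len(batch)

# Example: all 2-character lowercase passwords, split into chunks of 100.
chunks = list(chunk_keyspace(string.ascii_lowercase, 2, 100))
# 26^2 = 676 candidates -> 6 full chunks plus one partial chunk of 76
```

<p>In a real deployment the server would hand out only the chunk boundaries (start index and count), letting each agent regenerate its own candidates locally &#8211; shipping full candidate lists over the network would defeat the purpose of distribution.</p>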
<h2>The plugin problem</h2>
<p>EDPR does not use a monolithic engine for all file formats. Instead, it uses a plugin architecture. We have over 160 discrete plugins, each designed to handle a specific data format &#8211; whether it’s a ZIP archive, a RAR5 file, a BitLocker volume, or an iOS backup.</p>
<p>Each of these plugins is highly optimized. In password recovery, efficiency is everything; a 5% drop in speed can mean adding days to a recovery job. To achieve maximum throughput, many of these plugins were written with heavy use of inline assembler and low-level optimizations specifically tuned for 32-bit registers and instruction sets.</p>
<p>When NVIDIA removed 32-bit support, we couldn&#8217;t just hit &#8220;Recompile&#8221; in Visual Studio and target x64. The inline assembler code simply does not port over. The 64-bit architecture brings more registers and a different calling convention, but it also invalidates the nearly two decades of hand-tuned 32-bit assembly we had relied on.</p>
<h2>The migration struggle</h2>
<p>We started working on the 64-bit port immediately after NVIDIA announced the end of 32-bit support, but the process was grueling. We had to take each of those 160+ plugins, strip out the 32-bit assembly, and rewrite the computational kernels for 64-bit.</p>
<p>This introduced two major challenges:</p>
<ul>
<li><strong>Rewriting</strong>: Writing optimized 64-bit code from scratch for hundreds of algorithms is time-consuming.</li>
<li><strong>Regression testing</strong>: In some cases, the initial 64-bit ports were actually slower than their 32-bit predecessors. We had to spend months profiling and re-optimizing specific plugins to ensure that the move to 64-bit didn&#8217;t result in a performance penalty on older hardware. Even today, some 64-bit plugins still show regressions compared to their 32-bit counterparts, as a direct code migration simply does not deliver the same level of optimization.</li>
</ul>
<p>There were moments when we saw regressions on specific hash types because the 64-bit compiler optimizations didn&#8217;t behave exactly as our hand-tuned 32-bit ASM did. We had to manually intervene to bring the speed back up. That takes time. A lot of time.</p>
<h2>The result: ready for RTX 5090</h2>
<p>The result of this refactoring is the new 64-bit build of Elcomsoft Distributed Password Recovery. By moving the code base to 64-bit, we have unlocked native support for the latest CUDA and the NVIDIA Blackwell architecture.</p>
<p>If you secure an NVIDIA GeForce RTX 5090 or 5080 for your lab, you can now utilize it fully with EDPR. The new code can communicate directly with the latest drivers, utilizing the massive parallel throughput of the new cards. This update also future-proofs the tool. With Intel and AMD compute frameworks also being 64-bit only, EDPR is now aligned with the entire GPU acceleration ecosystem.</p>
<p>We are currently running extensive benchmarks comparing the RTX 4090 vs the RTX 5080 and RTX 5090 on this new build. Initial results show significant gains in high-iteration formats like VeraCrypt and specialized hash types. We will publish those numbers in a follow-up post.</p>
<p>Finally, it is worth emphasizing that no matter how well we optimize our code or how powerful the hardware becomes &#8211; even with thousands of high-end GPUs at your disposal &#8211; certain modern encryption algorithms simply cannot be defeated by brute force alone. In these cases, a more targeted approach is required. This begins with forensic triage using tools like <a href="https://www.elcomsoft.com/eqt.html">Elcomsoft Quick Triage</a>, which can instantly aggregate saved credentials from a live system to find &#8220;low-hanging fruit.&#8221; Beyond that, successful recovery often depends on building a detailed suspect profile to move from &#8220;cold&#8221; attacks to &#8220;smart&#8221; ones. By leveraging personal data to create targeted dictionaries, applying complex masks, and using rule-based mutations, investigators can focus their computational power on the most likely password candidates rather than wasting cycles on a mathematically impossible search.</p>
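<p>To put the &#8220;mathematically impossible&#8221; part in perspective: a truly random 10-character password drawn from the 95 printable ASCII characters gives 95<sup>10</sup>, or roughly 6&#215;10<sup>19</sup>, combinations &#8211; far beyond any realistic GPU cluster. Targeted attacks shrink that space instead of enumerating it. The Python sketch below is a simplified illustration &#8211; the mutation rules and suffix list are our own examples, not EDPR&#8217;s rule engine &#8211; of how profile-derived words can be expanded into candidates:</p>

```python
from itertools import product

# A few common "leet" substitutions used as one mutation rule.
LEET = {"a": "@", "e": "3", "i": "1", "o": "0", "s": "$"}

def mutate(word: str) -> set:
    """Apply a handful of rule-based mutations to one dictionary word."""
    variants = {word, word.lower(), word.capitalize(), word.upper()}
    variants.add("".join(LEET.get(c, c) for c in word.lower()))
    return variants

def targeted_candidates(profile_words, suffixes=("", "1", "123", "!", "2026")):
    """Cross profile-derived words and their mutations with common suffixes."""
    for word in profile_words:
        for variant, suffix in product(mutate(word), suffixes):
            yield variant + suffix

# Profile data: say, a pet name and a birth year.
cands = set(targeted_candidates(["rex", "1984"]))
```

<p>Even a short profile expands quickly under mutation, yet the resulting list remains many orders of magnitude smaller than the raw keyspace &#8211; which is precisely what makes &#8220;smart&#8221; attacks practical where brute force is not.</p>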
<h2>Benchmarks</h2>
<p><img fetchpriority="high" decoding="async" class="aligncenter size-large wp-image-12768" src="https://blog.elcomsoft.com/wp-content/uploads/2026/01/Microsoft-Office-365-1024x305.png" alt="" width="1024" height="305" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/01/Microsoft-Office-365-1024x305.png 1024w, https://blog.elcomsoft.com/wp-content/uploads/2026/01/Microsoft-Office-365-550x164.png 550w, https://blog.elcomsoft.com/wp-content/uploads/2026/01/Microsoft-Office-365-768x229.png 768w, https://blog.elcomsoft.com/wp-content/uploads/2026/01/Microsoft-Office-365.png 1200w" sizes="(max-width: 1024px) 100vw, 1024px" /> <img decoding="async" class="aligncenter size-large wp-image-12769" src="https://blog.elcomsoft.com/wp-content/uploads/2026/01/RAR-3-4-1024x305.png" alt="" width="1024" height="305" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/01/RAR-3-4-1024x305.png 1024w, https://blog.elcomsoft.com/wp-content/uploads/2026/01/RAR-3-4-550x164.png 550w, https://blog.elcomsoft.com/wp-content/uploads/2026/01/RAR-3-4-768x229.png 768w, https://blog.elcomsoft.com/wp-content/uploads/2026/01/RAR-3-4.png 1200w" sizes="(max-width: 1024px) 100vw, 1024px" /> <img decoding="async" class="aligncenter size-large wp-image-12868" src="https://blog.elcomsoft.com/wp-content/uploads/2026/02/ZIP-AES-256-1024x306.png" alt="" width="1024" height="306" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/02/ZIP-AES-256-1024x306.png 1024w, https://blog.elcomsoft.com/wp-content/uploads/2026/02/ZIP-AES-256-550x164.png 550w, https://blog.elcomsoft.com/wp-content/uploads/2026/02/ZIP-AES-256-768x229.png 768w, https://blog.elcomsoft.com/wp-content/uploads/2026/02/ZIP-AES-256.png 1199w" sizes="(max-width: 1024px) 100vw, 1024px" /> <img loading="lazy" decoding="async" class="aligncenter size-large wp-image-12869" src="https://blog.elcomsoft.com/wp-content/uploads/2026/02/SHA-256-1024x273.png" alt="" width="1024" height="273" 
srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/02/SHA-256-1024x273.png 1024w, https://blog.elcomsoft.com/wp-content/uploads/2026/02/SHA-256-550x147.png 550w, https://blog.elcomsoft.com/wp-content/uploads/2026/02/SHA-256-768x205.png 768w, https://blog.elcomsoft.com/wp-content/uploads/2026/02/SHA-256.png 1200w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></p>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/01/EDPR64-1200x630-1.png" length="980193" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/01/EDPR64-1200x630-1.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/01/EDPR64-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>Looks Can Lie: Is That Really an NVMe Drive?</title>
		<link>https://blog.elcomsoft.com/2026/03/looks-can-lie-is-that-really-an-nvme-drive/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Tue, 17 Mar 2026 12:42:01 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[NVMe]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=12986</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/nvme-blog.jpg" width="1200" height="630" title="" alt="" /></div><div>Many storage devices and adapter boards look alike. When holding a module with a connector that looks suspiciously like the M.2, how do you know exactly what you are dealing with? Is that M.2 board a SATA drive, a fast NVMe device or a Wi-Fi/Bluetooth combo? Will a drive removed from an Apple computer work [&#8230;]</div>]]></description>
<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/nvme-blog.jpg" width="1200" height="630" title="" alt="" /></div><div><p>Many storage devices and adapter boards look alike. When holding a module with a connector that looks suspiciously like M.2, how do you know exactly what you are dealing with? Is that M.2 board a SATA drive, a fast NVMe device, or a Wi-Fi/Bluetooth combo? Will a drive removed from an Apple computer work in a simple mechanical adapter, or will it require the original Apple device to access the data? A physical connector does not guarantee the underlying technology.</p>
<p>While the NVMe protocol powers everything from standard consumer M.2 slots to enterprise U.2 and EDSFF devices, the hardware world is full of deceptive lookalikes. For example, the proprietary raw NAND modules found in recent Apple hardware might look like standard NVMe disks, but they are completely unreadable outside their host system because they lack an onboard controller. In this article, we will break down the physical form factors that actually host true NVMe storage and help you spot the proprietary lookalikes.</p>
<h2>NVMe Form Factor &amp; Protocol Support Table</h2>
<p>Let us start with a table.</p>
<table style="width: 100%; border-collapse: collapse; text-align: left;">
<thead>
<tr style="border-bottom: 2px solid #333;">
<th style="padding: 10px;">Form Factor</th>
<th style="padding: 10px;">Interface/Transport</th>
<th style="padding: 10px;">Notes</th>
</tr>
</thead>
<tbody>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>M.2 (Standard M-Key / B-Key)</strong></td>
<td style="padding: 10px;">NVMe, PCIe, SATA</td>
<td style="padding: 10px;">The universal consumer standard. M-Key is standard for PCIe/NVMe SSDs, while B-Key is often used for SATA or cellular modems.</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>M.2 (Key E / Key A+E)</strong></td>
<td style="padding: 10px;">PCIe/USB (typically Wi-Fi/BT; rare NVMe use)</td>
<td style="padding: 10px;">Traditionally for Wi-Fi/Bluetooth. Natively supports PCIe, allowing niche NVMe SSDs to operate over the interface (though this is not the intended purpose).</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>Apple Proprietary (Old: 2013–2019)</strong></td>
<td style="padding: 10px;">NVMe, PCIe (AHCI)</td>
<td style="padding: 10px;">Varies by model year and connector generation. Proprietary blade connectors. Early iterations (pre-2015) utilized PCIe AHCI; later models adopted true NVMe.</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>Apple Proprietary (New: M-Series Mac Mini/Studio)</strong></td>
<td style="padding: 10px;"><strong>&#8211;</strong> (Raw NAND only)</td>
<td style="padding: 10px;">Removable storage boards that do not contain storage controllers. Will typically not work outside of Apple hardware.</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>HHHL (PCI-e Add-in Card)</strong></td>
<td style="padding: 10px;">NVMe, PCIe</td>
<td style="padding: 10px;">Half-Height, Half-Length form factor. Plugs directly into standard motherboard PCIe expansion slots.</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>U.2 / U.3 (SFF-8639)</strong></td>
<td style="padding: 10px;">NVMe, PCIe, SAS, SATA</td>
<td style="padding: 10px;">The traditional enterprise standard. U.3 refines the connector for &#8220;tri-mode&#8221; backplanes (NVMe, SAS, SATA interchangeability).</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>EDSFF (E1, E3, etc.)</strong></td>
<td style="padding: 10px;">NVMe, PCIe, CXL</td>
<td style="padding: 10px;">The modern data center standard. EDSFF SSDs are NVMe over PCIe, while the connector family is also being used for other device classes, including Compute Express Link (CXL) devices.</td>
</tr>
<tr style="border-bottom: 1px solid #ddd;">
<td style="padding: 10px;"><strong>Samsung NF1 (NGSFF)</strong></td>
<td style="padding: 10px;">NVMe, PCIe</td>
<td style="padding: 10px;">Early, proprietary high-density server format introduced by Samsung. Strictly utilized NVMe/PCIe for bandwidth. Considered legacy at this point.</td>
</tr>
</tbody>
</table>
<h3>Technical Notes</h3>
<p><strong>The M.2 form factor</strong> is the de facto standard for consumer drives. However, an M.2 board can host a variety of hardware and support a range of protocols, visually identifiable by the configuration of physical cutouts (the &#8220;keys&#8221;). Below is an image of a typical M.2 Key E module.</p>
<p><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-13008" src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_e.jpg" alt="" width="493" height="593" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_e.jpg 493w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_e-457x550.jpg 457w" sizes="auto, (max-width: 493px) 100vw, 493px" /></p>
<p><strong>The Apple 2013-2019 connectors.</strong> Apple SSDs from the 2013-2019 period use multiple proprietary connector types, most commonly 12+16-pin and later 22+34-pin designs. However, these connectors do not map cleanly to specific product lines or years, and both types may appear across overlapping generations. Earlier 12+16-pin modules were typically PCIe AHCI-based, while later designs transitioned to NVMe; connector type alone is not sufficient to determine protocol support. The newer 22+34-pin connector is associated with later NVMe-based SSDs. Exact model-year compatibility must always be verified; refer to <a href="https://beetstech.com/blog/apple-proprietary-ssd-ultimate-guide-to-specs-and-upgrades">Apple Proprietary SSDs: Ultimate Guide to Specs &amp; Upgrades | BeetsBlog</a> for details. 22+34 pin (top) and 12+16 pin (bottom) modules:</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-13009 size-medium" src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_22_34-550x344.jpg" alt="" width="550" height="344" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_22_34-550x344.jpg 550w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_22_34-1024x641.jpg 1024w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_22_34-768x481.jpg 768w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_22_34-1536x962.jpg 1536w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_22_34.jpg 1600w" sizes="auto, (max-width: 550px) 100vw, 550px" /> <img loading="lazy" decoding="async" class="aligncenter size-large wp-image-13010" src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_12_16.png" alt="" width="659" height="268" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_12_16.png 659w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/apple_12_16-550x224.png 550w" sizes="auto, (max-width: 659px) 100vw, 659px" /></p>
<p><strong>The Mac Mini &#8220;SSD&#8221;.</strong> The removable storage modules in modern Apple Silicon hardware (like the M4 Mac Mini or Mac Studio) are not NVMe drives, despite physically resembling a stubby M.2 SSD. They consist entirely of raw NAND flash chips. The actual NVMe storage controller is integrated directly into the Apple M-series SoC, meaning the physical connector only transmits raw NAND signals. As a result, these modules will not &#8220;work&#8221; outside of Apple hardware, and direct chip readouts will likely be fruitless due to encryption. A Mac Mini (2024) storage module (courtesy of <a href="https://www.ifixit.com/Guide/How+to+Replace+the+SSD+in+your+Mac+mini+(2024)/180199">iFixit</a>):</p>
<p><img loading="lazy" decoding="async" class="aligncenter size-large wp-image-13012" src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/mac-mini-1024x768.jpeg" alt="" width="1024" height="768" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/03/mac-mini-1024x768.jpeg 1024w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/mac-mini-550x413.jpeg 550w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/mac-mini-768x576.jpeg 768w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/mac-mini-1536x1152.jpeg 1536w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/mac-mini.jpeg 1600w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></p>
<p><strong>Samsung NF1 (formerly NGSFF).</strong> Designed as a &#8220;wider M.2&#8221; (30.5mm x 110mm) to accommodate multiple rows of high-capacity NAND chips for 1U servers. While it introduced enterprise features like hot-swapping and dual-port support, it failed to gain broad industry support. It is now considered a legacy format that never achieved the reach of the universally adopted EDSFF standard.</p>
<p><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-13013" src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/n1.png" alt="" width="720" height="220" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/03/n1.png 720w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/n1-550x168.png 550w" sizes="auto, (max-width: 720px) 100vw, 720px" /></p>
<p><strong>Key A+E NVMe Drives.</strong> The A+E keyed connectors were originally designed to host Wi-Fi/Bluetooth adapters. They are commonly used in ultrabooks, NUC PCs, and many desktop computer boards, typically featuring one or two PCIe lanes. Niche, third-party M.2 Key A+E NVMe drives exist, providing a hardware solution for adding a dedicated NVMe drive to space-constrained systems (like mini-PCs, homelab routers, or NAS builds). They have severe bandwidth limitations, as a standard M-Key slot provides four PCIe lanes (x4), whereas an A+E slot typically provisions only PCIe x1 or x2. In addition, system firmware compatibility can be inconsistent. Some older environments may employ hardware whitelisting (restricting the slot to specific Wi-Fi modules) or lack the necessary NVMe drivers to boot the OS from the A+E interface. Yet, the existence of such storage devices proves the point: you cannot assume that a given board is a Wi-Fi/Bluetooth combo just because it is keyed A+E.</p>
<p><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-13014" src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae.png" alt="" width="501" height="501" srcset="https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae.png 501w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae-150x150.png 150w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae-24x24.png 24w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae-48x48.png 48w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae-96x96.png 96w, https://blog.elcomsoft.com/wp-content/uploads/2026/03/m2_key_ae-300x300.png 300w" sizes="auto, (max-width: 501px) 100vw, 501px" /></p>
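<p>When the module can be safely installed in a host, the guesswork ends at the PCIe bus: whatever the connector looks like, an enumerated device reports a PCI class code that identifies what it actually is. The Python sketch below maps the relevant cases; the class-code constants come from the standard PCI class assignments (01h/06h SATA, 01h/08h NVMe, 02h/80h other network controller), and the sysfs path in the comment is the usual Linux location:</p>

```python
# PCI (base class, subclass) pairs -> human-readable category.
PCI_CLASSES = {
    (0x01, 0x06): "SATA controller (AHCI drive, not NVMe)",
    (0x01, 0x08): "Non-Volatile Memory controller (NVMe drive)",
    (0x02, 0x80): "Network controller (likely a Wi-Fi/Bluetooth combo)",
}

def classify(class_code: int) -> str:
    """Map a 24-bit PCI class code (base << 16 | subclass << 8 | prog-if)
    to a human-readable device category."""
    base = (class_code >> 16) & 0xFF
    sub = (class_code >> 8) & 0xFF
    return PCI_CLASSES.get((base, sub), f"Unknown class {class_code:06x}")

# On Linux, each enumerated PCI device exposes its class code in sysfs:
#   cat /sys/bus/pci/devices/0000:01:00.0/class  ->  0x010802
assert classify(0x010802).startswith("Non-Volatile")
```

<p><code>lspci -nn</code> prints the same codes in brackets &#8211; an NVMe SSD shows up as class <code>[0108]</code> regardless of whether it sits in an M-Key, A+E, or U.2 slot.</p>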
<h2>Conclusion</h2>
<p>Things are not always what they look like. An M.2 board can be a storage device or a Bluetooth adapter, and a Wi-Fi-card slot can host an NVMe drive. Apple&#8217;s proprietary modules come in both &#8220;12+16 pin&#8221; and &#8220;22+34 pin&#8221; variants; they can be either PCIe AHCI or NVMe, and will typically work in a third-party adapter. Newer Apple storage boards look like a stubby M.2 SSD &#8211; but they aren&#8217;t. Instead, they just host raw NAND chips, making the data entirely inaccessible outside of the host Apple device.</p>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/nvme-blog.jpg" length="240396" type="image/jpeg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/nvme-blog.jpg" width="1200" height="630" medium="image" type="image/jpeg">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/nvme-blog.jpg" width="1200" height="630" />
	</item>
		<item>
		<title>Android Pre-Installed Apps: What Could Possibly Go Wrong?</title>
		<link>https://blog.elcomsoft.com/2026/03/android-pre-installed-apps-what-could-possibly-go-wrong/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Fri, 13 Mar 2026 11:19:32 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Friday]]></category>
		<category><![CDATA[Friday primer]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=12950</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/android-software-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>Picture this: you just dropped $1,300 on a brand-new, top-of-the-line Android flagship. You unbox it, peel off the plastic film, boot it up, and get ready for the daily grind. But before you can even sync your contacts, you notice the app drawer is already cluttered with unsolicited apps. If you think this is a problem [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/android-software-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>Picture this: you just dropped $1,300 on a brand-new, top-of-the-line Android flagship. You unbox it, peel off the plastic film, boot it up, and get ready for the daily grind. But before you can even sync your contacts, you notice the app drawer is already cluttered with unsolicited apps. If you think this is a problem exclusive to fifty-dollar burner phones bought at a gas station or cheap Chinese handsets obtained from an online shopping site, think again. We&#8217;ve seen this corporate hoarding disease infect even the highest tiers. Just look at the new Samsung Galaxy S26 Ultra; a clean setup of a 512GB model immediately <a href="https://www.androidauthority.com/samsung-galaxy-s26-bloatware-preinstalled-apps-3647528/">sacrifices over 40GB</a> to system files and third-party apps you never asked for. To be clear, you get zero say in the matter &#8211; they are pre-installed without a single prompt. You pay top dollar for premium hardware, and the manufacturer still treats your device like a subsidized billboard.</p>
<p>But high-profile annoyances from giants like Meta and Microsoft are just the visible tip of the iceberg. The real problem is hidden slightly deeper in the supply chain: the pre-installed &#8220;utility&#8221; apps.</p>
<p>We’re talking about system cleaners, alternative keyboards, memory optimizers, and smart remotes. Why are they baked into your firmware? Pure, shortsighted corporate greed. Hardware manufacturers willingly partner with third-party software vendors, trading their brand reputation and your device&#8217;s attack surface for a negligible pre-installation bounty. OEMs actively compromise the integrity of millions of devices for fractions of a cent per unit.</p>
<p>To scrape together pennies on the production margin, manufacturers grant these dubious vendors the absolute holy grail of Android persistence. By baking these &#8220;utilities&#8221; directly into the factory image, the OEM implicitly vouches for them. They fly below the radar of your due diligence and bypass the standard user-consent gauntlet, inheriting deep system access and unshakeable persistence simply by existing on the device right out of the box.</p>
<p>At this point, there are literally millions of devices running a highly privileged, un-deletable piece of third-party code fully controlled by an external vendor who eventually needs to show a return on investment.</p>
<p>What could possibly go wrong?</p>
<h2>The &#8220;Trusted&#8221; Partition</h2>
<p>This is where Android’s security model meets the timeless urge to squeeze one more coin out of the hardware. When an OEM cuts a deal with a third-party software vendor, the app (or its stub) gets baked straight into the firmware. Usually that means /system/app; if the vendor needs deeper integration, maybe even /system/priv-app.</p>
<p>And that matters. A pre-installed app is not just another app. It starts life inside the factory image, wrapped in OEM trust, and in some cases gets privileges that ordinary user-installed software can only dream about. Not because it earned them, of course. Just because someone in a meeting decided the pre-install check was worth more than the long-term headache.</p>
<p>To save space, manufacturers sometimes do not preload the full app. They ship a stub: a placeholder APK with an icon, some metadata, and a digital signature. Later, during setup, that stub pulls down the payload from Google Play or the OEM’s own app store. Depending on how the package is signed and allowlisted, the downloaded version can continue enjoying some of the trust and privileges attached to the factory stub.</p>
<h2>The Update Hijack Mechanism</h2>
<p>So what happens after the phone leaves the factory? At shipment time, the pre-installed app may be harmless, polished just enough to survive the OEM&#8217;s review. Months later, that same vendor still controls the signing keys, so it can push an updated APK through Google Play at any point in the future, and Android will accept it as the same app as long as the signature checks out. The modified app then carries on with the same permissions, trust, and privileged treatment already granted by the OEM. In practice, this makes the OEM&#8217;s initial security review far less reassuring, effectively turning a one-time decision into a long-term channel for shipping arbitrary privileged payloads onto the user&#8217;s device.</p>
<h2>The Metamorphosis: From Useful Tool to Exit Strategy</h2>
<p>The life cycle of a pre-installed Android utility is depressingly predictable. It usually starts with a genuinely useful tool, a massive user base, and a lucrative OEM contract. But eventually, the native OS simply gets good enough to make third-party memory cleaners and IR remotes completely obsolete. Faced with an existential crisis and investors demanding a return, the developers become tempted to execute their exit strategy &#8211; which, in many cases, is a pivot toward aggressive monetization. At that exact moment, the phone they were originally supposed to optimize stops being the client and becomes the product &#8211; just raw material to be aggressively mined for lock-screen ad impressions, background data collection, and click-fraud revenue. Here is what that downward spiral looks like in practice.</p>
<h3>ES File Explorer: from indispensable tool to unauthenticated LAN honeypot</h3>
<p>I genuinely used to use this application back in the day, but the tragic arc of DO Global&#8217;s ES File Explorer is a masterclass in burning user goodwill. Facing pressure to monetize a massive base of over 100 million active installations, the developers pushed an update introducing a &#8220;Smart Charge&#8221; feature that forcefully hijacked the user&#8217;s lock screen whenever the device was plugged in, replacing it with deceptive metrics and intrusive advertisements. As if aggressively rendering devices unusable wasn&#8217;t enough, the monetization drive ultimately compromised the app&#8217;s foundational security architecture. As <a href="https://www.bleepingcomputer.com/news/security/es-file-explorer-flaws-put-100-million-users-data-at-risk-fix-promised/">BleepingComputer reported</a>, the developers quietly spun up a completely unauthenticated, hidden HTTP web server on local TCP port 59777 (CVE-2019-6447) every time the application launched. Because there were zero authentication mechanisms, anyone connected to the same public Wi-Fi network could seamlessly interface with the app to map the victim&#8217;s device, silently extract personal files, and remotely execute commands.</p>
<h3>Clean Master &amp; QuickPic: from device optimizer to silent click-fraud syndicate</h3>
<p>Cheetah Mobile achieved a healthy market share by cutting deals with multiple OEMs to embed its &#8220;Clean Master&#8221; utility into factory firmware, but behind the facade of device optimization, it was fundamentally engineered as a data monetization entity. Instead of clearing system caches, the updated applications operated silently in the background, utilizing their expansive system permissions to run a massive advertising fraud scheme that injected synthetic &#8220;clicks&#8221; to fraudulently siphon millions in referral bounties. The syndicate later expanded their software portfolio; they famously acquired the beloved, lightweight gallery app QuickPic, solely to strip-mine its fiercely loyal user base. As exposed in this <a href="https://www.androidpolice.com/2018/11/28/evidence-points-to-cheetah-mobile-and-kika-tech-engaging-in-massive-click-fraud-scheme/">Android Police investigation</a>, Cheetah Mobile pushed updates that instantly destroyed QuickPic&#8217;s core value. The end result? QuickPic was swept up in the broader Google ban of Cheetah Mobile products in 2018, though brief, bug-ridden re-releases attempted to bypass the block.</p>
<h3>TouchPal: from swipe-typing pioneer to an adware nightmare</h3>
<p>A third-party keyboard inherently requires an absolute, unshakeable trust between the hardware manufacturer and the software vendor, simply because it possesses the capability to intercept every single password and private communication entered by the user. CooTek&#8217;s TouchPal keyboard thoroughly violated this trust after securing pre-installation agreements on premium devices from over 50 manufacturers, including HTC. The developers secretly integrated a heavily obfuscated advertising plugin known as &#8220;BeiTaAd,&#8221; an absolute masterpiece of malicious engineering designed to aggressively evade automated security scanners. As the original <a href="https://www.lookout.com/threat-intelligence/article/beitaplugin-adware">Lookout threat research</a> detailed, the plugin would lie completely dormant for 24 hours to two weeks after an update. It would then wake up to forcefully hijack the lock screen and trigger incredibly loud out-of-app audio and video advertisements, rendering the mobile device nearly unusable even while asleep in the user&#8217;s pocket.</p>
<h3>Peel Smart Remote: from hardware companion to panicked adware engine</h3>
<p>Peel Smart Remote started out as a genuinely useful IR remote control app, deeply integrated into the firmware of millions of Android phones and effectively turning them into universal remotes through their built-in IR blasters. Then the smartphone industry moved on. Manufacturers started dropping IR hardware, Peel’s original value proposition collapsed, and management responded with the sort of quiet panic that usually produces terrible product decisions. The updated versions turned the once-useful utility into an intrusive adware engine, using its pre-installed foothold to <a href="https://www.jamf.com/blog/previously-preinstalled-peel-tv-remote-app-is-behaving-like-adware/">overlay advertisements</a> on the lock screen and disrupt normal device use. As <a href="https://www.androidpolice.com/2017/03/29/peel-remote-app-upsets-users-ton-ads-lock-screen-overlays/">Android Police</a> noted, the app began displaying full-screen ads, opening random spam websites, and modifying the lock screen without making it especially obvious what was causing the mess. In other words, a classic case of a pre-installed utility degenerating into nuisanceware once the original business stopped making sense.</p>
<h2>The True Cost of &#8220;Free&#8221; Software</h2>
<p>At the end of the day, this isn&#8217;t a story about genius-level hackers pulling off the heist of the century. It’s just a boring, predictable failure of corporate economics &#8211; a classic &#8220;what could possibly go wrong&#8221; scenario. When user consent gets tossed out on the factory assembly line, the whole Android security model falls apart. You can&#8217;t even rely on automated scanners like Google Play Protect to save you. They are built to spot rogue malware trying to break in, not to evict a &#8220;trusted&#8221; partner who was explicitly handed the keys to the vault by the manufacturer. Trading hardware integrity for a fraction of a cent per unit is simply bad business.</p>
<p>The root cause here isn&#8217;t broken code or inherent design flaws; it&#8217;s just your usual, boring corporate greed. Hardware manufacturers have perfected the trick. They pocket the upfront cash, cement the third-party clutter into an undeletable system folder, and then conveniently look the other way when that same app pivots to click-fraud. The manufacturer gets to plead ignorance, the app developer cashes out, and you are left holding a hijacked phone. You might have bought the phone, but make no mistake: you are still the product.</p>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/android-software-1200x630-1.png" length="755300" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/android-software-1200x630-1.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/android-software-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>The C:\User Data in Windows Forensics</title>
		<link>https://blog.elcomsoft.com/2026/03/the-cuser-data-in-windows-forensics/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Thu, 12 Mar 2026 11:00:22 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[EQT]]></category>
		<category><![CDATA[Windows]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=12934</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-3-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>This article concludes our series on Windows forensic artefacts and the role they play in real-world investigations. Over the past several weeks, we looked at evidence sources that help investigators understand activity at the system level, from Windows Event Logs and the Windows Registry to file system traces stored under C:\Windows and C:\ProgramData. Those artefacts [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-3-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>This article concludes our series on Windows forensic artefacts and the role they play in real-world investigations. Over the past several weeks, we looked at evidence sources that help investigators understand activity at the system level, from <a href="https://blog.elcomsoft.com/2026/02/forensic-analysis-of-windows-10-and-11-event-logs/">Windows Event Logs</a> and the <a href="https://blog.elcomsoft.com/2026/02/investigating-windows-registry/">Windows Registry</a> to file system traces stored under <a href="https://blog.elcomsoft.com/2026/03/investigating-windows-file-system-artifacts-under-cwindows/">C:\Windows</a> and <a href="https://blog.elcomsoft.com/2026/03/windows-file-system-artefacts-under-cprogramdata/">C:\ProgramData</a>. Those artefacts are indispensable when reconstructing the broader picture: system startup and shutdown, service activity, software installation, persistence mechanisms, and signs of compromise affecting the machine as a whole. Yet system-wide telemetry has an obvious limitation. It can tell us that something happened, but not always who was behind it. This is where the focus shifts from the operating system to the individual user.</p>
<p>Modern Windows systems are designed to isolate user environments from the core OS. Documents, downloads, application caches, cloud sync data, shortcuts, thumbnails, temporary files, and countless other traces of day-to-day activity are pushed into the user profile. In that sense, <code>C:\Users\&lt;username&gt;</code> is perhaps the closest thing Windows has to a behavioral map of a specific person using that system.</p>
<p>In this final installment, we move away from system-level evidence and into user-specific artefacts stored under the profile directory. This is where attribution becomes more precise, where ordinary folders such as Desktop, Downloads, and Documents contain data created or acquired by the user, and where hidden application data can reveal what a user opened, downloaded, edited, synchronized, or tried to remove.</p>
<h2><code>C:\Users</code> vs. <code>%USERPROFILE%</code></h2>
<p>Before looking at specific artefacts, let&#8217;s make one practical distinction: user profile data is accessed differently on a live system and in offline analysis. The <code>C:\Users</code> directory is the physical, hardcoded root for user profiles on a standard Windows installation. In forensic work, that path is most often associated with dead-box analysis. Once an examiner mounts a forensic disk image, this static directory offers a complete view of every profile stored on the system, not just the account that was active at the moment the machine was seized. That wider view matters in real cases, where disabled accounts, long-forgotten profiles, or attacker-created local users may still hold useful evidence.</p>
<p><code>%USERPROFILE%</code>, on the other hand, is a live environment variable that resolves to the home directory of the user currently logged into that session, which makes it useful during live system analysis and rapid triage. Investigators can use <code>%USERPROFILE%</code> to point scripts and collection tools at the active account. In practice, that makes it a natural fit for PowerShell scripts and other automated workflows.</p>
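<p>To illustrate the idea, here is a minimal Python sketch: the same list of profile-relative templates can be resolved against the live environment during triage, or against an explicit mapping when working with a mounted image. The target list and the <code>alice</code> profile below are examples, not a recommended collection set.</p>

```python
import os

# Profile-relative locations a triage script might collect. The variable
# names mirror the Windows environment variables discussed above; the
# target list is illustrative only.
TRIAGE_TARGETS = [
    r"%USERPROFILE%\NTUSER.DAT",
    r"%USERPROFILE%\AppData\Local\Microsoft\Windows\UsrClass.dat",
    r"%USERPROFILE%\Downloads",
    r"%USERPROFILE%\AppData\Roaming\Microsoft\Windows\Recent",
]

def resolve_targets(targets, env=None):
    """Expand %VAR% references against the live environment, or against
    a supplied mapping (useful for testing and for offline mounts)."""
    env = dict(os.environ if env is None else env)
    resolved = []
    for template in targets:
        path = template
        for name, value in env.items():
            path = path.replace(f"%{name}%", value)
        resolved.append(path)
    return resolved

# With an explicit mapping, no live system is required:
paths = resolve_targets(TRIAGE_TARGETS, env={"USERPROFILE": r"C:\Users\alice"})
print(paths[0])  # C:\Users\alice\NTUSER.DAT
```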
<p>Although this article focuses on file system artefacts, we have to mention the user’s registry hives. The two key files are <code>%USERPROFILE%\NTUSER.DAT</code> and <code>%USERPROFILE%\AppData\Local\Microsoft\Windows\UsrClass.dat</code>. The first is the main per-user hive, storing user-specific settings, execution traces, and file access history. The second contains valuable shell-related artefacts such as ShellBags and MUICache entries, and is particularly important on modern Windows systems. We will leave those hives aside here to keep the focus on file system evidence, but they should always be acquired together with the rest of the profile. For a deeper discussion, see our earlier article on the <a href="https://blog.elcomsoft.com/2026/02/investigating-windows-registry/">Windows Registry</a>.</p>
<p><strong>Caveat:</strong> <code>C:\Users</code> is only the default profile root and must not be assumed in every installation. The system-wide base path for user profiles is recorded in the Registry at <code>HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\ProfilesDirectory</code>, while a specific user’s actual profile path (<code>%USERPROFILE%</code>) is defined per SID at <code>HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\&lt;SID&gt;\ProfileImagePath</code>. In other words, profiles may be redirected to another volume, so analysts should verify the registered profile paths instead of relying on the presence of a <code>C:\Users</code> folder alone.</p>
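<p>The verification logic is simple enough to sketch in a few lines of Python. The values below are hypothetical, standing in for data already read from an offline <code>SOFTWARE</code> hive; note that <code>ProfileImagePath</code> is of type REG_EXPAND_SZ, so embedded variables still need expansion.</p>

```python
# Hypothetical values as they might be read from an offline SOFTWARE hive;
# ProfileImagePath is REG_EXPAND_SZ, so %SystemDrive% still needs expansion.
profile_list = {
    "ProfilesDirectory": r"%SystemDrive%\Users",
    "S-1-5-21-1111111111-222222222-3333333333-1001": {
        "ProfileImagePath": r"D:\Profiles\j.doe",  # redirected profile
    },
    "S-1-5-21-1111111111-222222222-3333333333-1002": {
        "ProfileImagePath": r"%SystemDrive%\Users\admin",
    },
}

def expand(value, system_drive="C:"):
    # Only %SystemDrive% is handled here; a full tool would expand the
    # complete set of REG_EXPAND_SZ variables.
    return value.replace("%SystemDrive%", system_drive)

def enumerate_profiles(plist, system_drive="C:"):
    """Yield (SID, resolved profile path) pairs."""
    for key, value in plist.items():
        if isinstance(value, dict) and "ProfileImagePath" in value:
            yield key, expand(value["ProfileImagePath"], system_drive)

profiles = dict(enumerate_profiles(profile_list))
# The 1001 profile lives on D:, which a C:\Users-only review would miss.
```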
<h2>Predefined User Folders and Native Applications</h2>
<p>When Windows creates a user profile, it also creates a set of default folders under <code>C:\Users\&lt;username&gt;</code>. To the user, these are convenient save locations. To an investigator, they are some of the most productive artefacts on the system. Windows, browsers, email clients, Office apps, and built-in tools all write to these folders by default, which makes them a practical record of what the user downloaded, opened, edited, staged, or tried to remove.</p>
<h3>The Downloads folder</h3>
<p><strong>What it is:</strong> Downloads, typically located at <code>%USERPROFILE%\Downloads</code>, is the default save location for files arriving from the Internet, email, messaging apps, and local networks.</p>
<p><strong>Forensic value:</strong> This is often where a compromise first touches disk. The file itself matters, but the hidden metadata can matter even more. On NTFS volumes, Windows may attach a <code>Zone.Identifier</code> alternate data stream, which marks the file as a Web download. That stream can preserve the security zone, referrer URL, and direct host URL of the downloaded file. Even if the payload is later renamed, that metadata may still point to the phishing page, malware host, or delivery server that brought it onto the endpoint. Missing or stripped <code>Zone.Identifier</code> data on executables, scripts, or archives can also be meaningful, as removing the <a href="https://en.wikipedia.org/wiki/Mark_of_the_Web">Mark of the Web</a> is a common way to suppress security warnings. Investigators should also check <code>%USERPROFILE%\Links</code>, which may contain <code>Downloads.lnk</code> pointing back to this folder.</p>
<h3>The Desktop folder</h3>
<p><strong>What it is:</strong> The Desktop is the user’s visible workspace, holding files, folders, and shortcuts shown directly in the shell. Depending on configuration, it may be stored locally at <code>%USERPROFILE%\Desktop</code> or redirected into OneDrive at <code>%USERPROFILE%\OneDrive\Desktop</code>.</p>
<p><strong>Forensic value:</strong> People use the Desktop as a scratchpad, and attackers do too. In insider cases, it is often used to stage files before compression, copying, or exfiltration. In intrusion cases, it commonly holds payloads, scripts, and ransom notes placed where the user will see them immediately. Even when the original file is gone, a leftover <code>.lnk</code> shortcut on the Desktop may still show that the user had direct access to it. A second place worth checking is <code>%USERPROFILE%\Links</code>, which may contain <code>Desktop.lnk</code> pointing to the active Desktop location.</p>
<h3>The Documents folder</h3>
<p><strong>What it is:</strong> Documents, usually stored in <code>%USERPROFILE%\Documents</code>, remains the default save location for Office files, PDFs, text files, exports, and other work product.</p>
<p><strong>Forensic value:</strong> In many cases, this is where the data of interest actually lives. That alone makes it a prime target in theft and ransomware incidents. It can also contain useful secondary traces. One example is the temporary Office lock file, usually prefixed with <code>~$</code>. If one of these files is left behind, it can show that the corresponding document was open and being edited when the system crashed, was shut down abruptly, or the session ended unexpectedly.</p>
<h3>Pictures, Music, and Videos</h3>
<p><strong>What it is:</strong> These are the default Windows libraries for media storage and browsing, typically located at <code>%USERPROFILE%\Pictures</code>, <code>%USERPROFILE%\Music</code>, and <code>%USERPROFILE%\Videos</code>.</p>
<p><strong>Forensic value:</strong> Media folders are easy to skip in enterprise cases, but they can be highly valuable. One reason is <code>Thumbs.db</code>. In older Windows versions, and still in some network-folder scenarios, Windows stores thumbnail previews in hidden <code>Thumbs.db</code> files inside image folders. Those thumbnails may survive after the original files are deleted, giving investigators proof that a given image once existed in that location. Large media folders can also provide cover for hidden payloads or stolen data concealed inside otherwise ordinary-looking files.</p>
<h3>The Recent folder and LNK files</h3>
<p><strong>What it is:</strong> Windows maintains a hidden Recent folder at <code>%APPDATA%\Microsoft\Windows\Recent</code> (physically <code>%USERPROFILE%\AppData\Roaming\Microsoft\Windows\Recent</code>). When a user opens a document, picture, or application, Windows often creates a shortcut file pointing to that target and stores it there.</p>
<p><strong>Forensic value:</strong> These <code>.lnk</code> files are some of the most useful profile artefacts available. They often survive after the original file has been deleted or the USB drive has been disconnected. More importantly, an LNK file is not just a pointer. It can preserve the original target path, target timestamps, and storage-device details such as volume serial number and drive type. That makes Recent especially useful when the goal is to prove file access, not just file presence.</p>
<h3>The Favorites folder</h3>
<p><strong>What it is:</strong> Favorites, typically located at <code>%USERPROFILE%\Favorites</code>, is a legacy shell folder originally used by Internet Explorer to store bookmarked web links and shortcuts.</p>
<p><strong>Forensic value:</strong> While less prominent on modern systems, this folder still appears on many Windows installations and can retain useful historical traces. In older user profiles, it may preserve bookmarked URLs, manually saved shortcuts, or application-created links that help reconstruct browsing habits, user interests, or access to specific internal and external resources. In some cases, its value is less about current activity and more about persistence: artefacts left in Favorites can survive browser changes and remain in the profile long after the original workflow has been abandoned.</p>
<h3>Microsoft OneDrive</h3>
<p><strong>What it is:</strong> OneDrive is built into Windows 10 and 11 as the default cloud sync engine. Its visible sync root is usually <code>%USERPROFILE%\OneDrive</code>, and it supports local files, cloud-only placeholders, and Files On-Demand.</p>
<p><strong>Important:</strong> Partial sync is a real issue, affecting both offline and live system analysis &#8211; just in different ways. We strongly recommend familiarizing yourself with the issue by reading <a href="https://blog.elcomsoft.com/2026/01/the-cloud-gap-forensic-triage-vs-disk-imaging-in-the-age-of-on-demand-sync/">The Cloud Gap: Forensic Triage vs. Disk Imaging in the Age of On-Demand Sync</a>.</p>
<p><strong>Forensic value:</strong> OneDrive extends the user profile beyond the local disk. Investigators should look in three places. The first is the visible sync root, usually <code>%USERPROFILE%\OneDrive</code>, where cloud-only files may appear as placeholders. Even without the full file body on disk, those entries can still show that the user knew about the file and had access to it. The second is <code>%LOCALAPPDATA%\Microsoft\OneDrive\logs</code>, which stores <code>.odl</code> synchronization logs. These can help reconstruct uploads, downloads, renames, deletions, and, in some cases, sharing activity. The third is <code>%LOCALAPPDATA%\Microsoft\OneDrive\settings</code>, where files such as <code>UserCid.dat</code> and <code>SyncEngineDatabase.db</code> can link the local Windows account to a Microsoft identity and expose the structure of synchronized cloud data.</p>
<h3>Windows 11 Notepad</h3>
<p><strong>What it is:</strong> On Windows 11, Notepad is no longer a bare-bones text editor. It now supports session persistence and can restore unsaved tabs.</p>
<p><strong>Forensic value:</strong> That feature leaves a useful artefact behind. Notepad stores active tab contents in binary <code>.bin</code> files under <code>%LOCALAPPDATA%\Packages\Microsoft.WindowsNotepad_8wekyb3d8bbwe\LocalState\TabState</code>. In practice, that means unsaved text may still be recoverable after the user closes the app without saving. Notes, pasted credentials, IP lists, and command fragments can all survive in TabState, turning what used to be volatile user activity into file-system evidence.</p>
<h3>Paint, WordPad, and other lightweight native editors</h3>
<p><strong>What it is:</strong> Tools such as Paint, WordPad, and the legacy Write application are basic editors that are present on many Windows systems by default.</p>
<p><strong>Forensic value:</strong> Their value lies in how often they are used precisely because they are already there. Paint can leave cache-related traces in AppData while an image is being edited, which may help show that a screenshot or graphic was manipulated locally. WordPad and Write can generate temporary files and <code>.lnk</code> traces in the Recent folder when documents are opened or edited. In practice, that may be enough to show that a user viewed or modified a file without ever installing third-party software.</p>
<h2>AppData and Dot-Prefixed Folders</h2>
<p><code>%USERPROFILE%\AppData</code> has the hidden file-system attribute set. The goal is not to conceal it from the user or the examiner, but to reduce visual clutter in File Explorer. This is where Windows and applications store configuration data and other files not intended for routine user access: settings, caches, session state, temporary files, logs, browser data, and similar artefacts generated during normal use. It is also one of the richest sources of user-attributed evidence on the system, regardless of whether the software is built into Windows or installed separately.</p>
<p>AppData is divided into three subfolders with distinct roles: <code>Roaming</code>, <code>Local</code>, and <code>LocalLow</code>. That split reflects how Windows treats user data. Some data is meant to follow the user between domain-connected systems, some stays with a specific machine, and some is written by sandboxed low-integrity processes. Forensic analysis of these folders helps separate user behavior that is portable from artefacts tied to one endpoint or one restricted execution context.</p>
<h3>The Roaming subfolder</h3>
<p><strong>What it is:</strong> <code>%APPDATA%</code>, which resolves to <code>%USERPROFILE%\AppData\Roaming</code>, stores data intended to follow the user across multiple domain-connected systems. In a traditional Active Directory environment, this can include application preferences, bookmarks, dictionaries, and other portable settings. Notably, in standalone or workgroup setups, Roaming still exists and still holds the same application data &#8211; it just never leaves the machine.</p>
<p><strong>Forensic value:</strong> Roaming often ties application activity to the user rather than to a single machine. It commonly stores core profiles for browsers, messaging tools, FTP clients, and other communication software, making it a useful source of chat histories, saved credentials, bookmarks, and application settings. In domain environments, the same synchronized artefacts appearing across multiple endpoints can help link repeated activity to the same user account. Roaming is also a common location for malware persistence. Standard users can write there without administrative rights, making it a practical drop point for scripts, keyloggers, loaders, and disguised executables (double extension, icon spoofing, masquerading as known app subfolders and so on). Unexpected binaries in Roaming deserve attention.</p>
<h3>The Local subfolder</h3>
<p><strong>What it is:</strong> <code>%LOCALAPPDATA%</code>, or <code>%USERPROFILE%\AppData\Local</code>, stores data tied to the specific machine. It does not roam with the user, even in a domain environment. Windows and applications use it for large caches, temporary content, update packages, installer remnants, and other data that is too bulky or too device-specific to sync efficiently.</p>
<p><strong>Note:</strong> <code>%LOCALAPPDATA%\Programs</code> is where user-level application installs typically land (no UAC prompt or admin rights required), making it a common drop location for both legitimate portable apps and malware seeking persistence without elevation.</p>
<p><strong>Forensic value:</strong> Local is often where device-specific activity is recorded. Modern browsers keep much of their cache data here, which can help reconstruct browsing activity, recover fragments of viewed content, and trace downloads in more detail than the visible Downloads folder alone. Local also contains important Windows artefacts, including the centralized thumbnail cache under <code>%LOCALAPPDATA%\Microsoft\Windows\Explorer</code>. Those Thumbcache databases can show images viewed on the system even when the original files are gone. Another high-value location is <code>%LOCALAPPDATA%\Temp</code>, where installers, updaters, and many malicious payloads unpack working files. Timestamps in Temp can help build a detailed execution or installation timeline. Notably, <code>%TEMP%</code> and <code>%TMP%</code> typically resolve there, but some users may override their locations (e.g. to point to a scratch disk).</p>
<h3>The LocalLow subfolder</h3>
<p><strong>What it is:</strong> <code>%USERPROFILE%\AppData\LocalLow</code> is separate from Local for security reasons. It is the designated write location for processes running at low integrity under Windows Mandatory Integrity Control. In practice, sandboxed or partially isolated applications often use LocalLow instead of the standard Roaming or Local paths.</p>
<p><strong>Forensic value:</strong> LocalLow is most relevant in cases involving browsers, app sandboxes, or exploit chains. Low-integrity processes have limited write access, so their caches, temporary files, and session artefacts often end up here. That makes LocalLow useful for tracing protected browser activity, low-integrity execution, and the early stages of web-based compromise. If an attacker gained an initial foothold through a browser sandbox or another constrained process, traces of that activity may remain in LocalLow before privilege escalation or sandbox escape shifted activity elsewhere.</p>
<h3>Configuration folders: the dot-prefixed directories</h3>
<p><strong>What they are:</strong> Windows traditionally stored most per-user settings in the Registry and AppData, but modern cross-platform tools increasingly use Unix-style configuration files and folders in the root of the user profile. Common examples include <code>%USERPROFILE%\.ssh</code>, <code>%USERPROFILE%\.aws</code>, <code>%USERPROFILE%\.vscode</code>, and <code>%USERPROFILE%\.gitconfig</code>. These locations often store authentication material, connection history, and cloud or development tool settings.</p>
<p><strong>Forensic value:</strong> Dot-prefixed artefacts can be highly significant. The <code>.ssh</code> folder may contain private keys such as <code>id_rsa</code> or <code>id_ed25519</code>, along with <code>known_hosts</code>, which records servers the user connected to. That can help map lateral movement and identify access to internal Linux systems or cloud infrastructure. The <code>.aws</code> folder may contain plaintext access keys and secret tokens used by AWS command-line tools. <code>.gitconfig</code> can link the local Windows account to a developer identity through names, email addresses, and repository settings. The <code>.vscode</code> directory can expose workspace settings, remote connection history, and extension data, helping show what repositories were accessed and whether a malicious extension was used. In incident response, these folders often connect a compromised workstation to activity in source code systems, cloud environments, or remote servers.</p>
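A quick inventory pass over these dot-prefixed folders can be scripted. The sketch below (function name and the key-file lists are illustrative assumptions, not a standard) walks a profile root and notes which high-value files are present:

```python
from pathlib import Path

# High-value files to look for per dot-directory; lists are illustrative.
KEY_FILES = {
    ".ssh": {"id_rsa", "id_ed25519", "known_hosts"},
    ".aws": {"credentials", "config"},
}

def scan_dot_dirs(profile_root):
    """Inventory dot-prefixed directories in a user profile, noting key files."""
    findings = {}
    for entry in Path(profile_root).iterdir():
        if entry.is_dir() and entry.name.startswith("."):
            wanted = KEY_FILES.get(entry.name, set())
            findings[entry.name] = sorted(
                f.name for f in entry.iterdir()
                if f.is_file() and (not wanted or f.name in wanted)
            )
    return findings
```

For unknown dot-directories the sketch simply lists every file, leaving relevance judgments to the examiner.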
<h2>Conclusion</h2>
<p>User profile artefacts are some of the most revealing traces on a Windows system, but they rarely make sense in isolation. A file in <code>Downloads</code>, a shortcut in <code>Recent</code>, a thumbnail cache entry, a OneDrive sync log, or a leftover Notepad tab snapshot may each tell only part of the story. The real value comes from correlation. Investigators have to connect visible folders, hidden AppData stores, shell artefacts, cloud traces, and configuration files into a single timeline that explains not just what happened on the machine, but which user likely did it.</p>
<p>That is what makes <code>C:\Users</code> such an important forensic location. System-wide telemetry can show that an event occurred, that software was installed, or that a payload executed. The user profile is where those technical facts start to become attributable. It is where downloaded files meet recently opened documents, where application caches confirm user activity, and where synchronized cloud data or stored authentication material can extend the investigation beyond the local endpoint. During incident response, this kind of granular, localized evidence is often the difference between observing suspicious activity and confidently tying that activity to a specific human operator.</p>
<p>With this article, we conclude our planned series on Windows artefacts. We began with Event Logs and the Registry, moved through file system artefacts under <code>C:\Windows</code> and <code>C:\ProgramData</code>, and finished with the user profile, the part of the system where technical activity and human behavior intersect most directly. Taken together, these sources form a practical map for Windows forensic analysis: broad system telemetry for context, and user-level artefacts for attribution.</p>
<p>Previous articles in the series:</p>
<ol>
<li><a href="https://blog.elcomsoft.com/2026/02/forensic-analysis-of-windows-10-and-11-event-logs/">Forensic Analysis of Windows 10 and 11 Event Logs (ElcomSoft blog)</a></li>
<li><a href="https://blog.elcomsoft.com/2026/02/investigating-windows-registry/">Investigating Windows Registry (ElcomSoft blog)</a></li>
<li><a href="https://blog.elcomsoft.com/2026/03/investigating-windows-file-system-artifacts-under-cwindows/">Investigating Windows File System Artifacts Under C:\Windows (ElcomSoft blog)</a></li>
<li><a href="https://blog.elcomsoft.com/2026/03/windows-file-system-artefacts-under-cprogramdata/">Windows File System Artefacts Under C:\ProgramData (ElcomSoft blog)</a></li>
</ol>
<p><em>Many thanks to the authors and researchers of the prior work cited below. Their research, writeups, and public tooling helped shape both this article and the broader DFIR community.</em></p>
<ol>
<li><a href="https://learn.microsoft.com/en-us/previous-versions/windows/desktop/legacy/bb776892%28v%3Dvs.85%29">About User Profiles (Microsoft Learn)</a></li>
<li><a href="https://learn.microsoft.com/en-us/windows-server/storage/folder-redirection/folder-redirection-rup-overview">Folder Redirection and Roaming User Profiles in Windows and Windows Server (Microsoft Learn)</a></li>
<li><a href="https://elitedigitalforensics.com/windows-forensic-artifacts-user-activity/">Windows Forensic Artifacts User Activity (Elite Digital Forensics)</a></li>
<li><a href="https://redcanary.com/threat-detection-report/techniques/mark-of-the-web-bypass/">Mark of the Web Bypass (Red Canary Threat Detection Report)</a></li>
<li><a href="https://forensics.wiki/thumbs.db/">Thumbs.db (Forensics Wiki)</a></li>
<li><a href="https://forensics.wiki/windows_thumbcache/">Windows thumbcache (Forensics Wiki)</a></li>
<li><a href="https://forensics.wiki/lnk/">Lnk (Forensics Wiki)</a></li>
<li><a href="https://www.swiftforensics.com/2022/02/reading-onedrive-logs.html">Reading OneDrive Logs (Yogesh Khatri&#8217;s forensic blog)</a></li>
<li><a href="https://www.cyberengage.org/post/2-onedrive-forensics-investigating-cloud-storage-on-windows-systems">OneDrive Forensics: Investigating Cloud Storage on Windows Systems (CyberEngage)</a></li>
<li><a href="https://u0041.co/posts/articals/exploring-windows-artifacts-notepad-files/">Exploring windows artifacts notepad files (u0041)</a></li>
<li><a href="https://securelist.com/forensic-artifacts-in-windows-11/117680/">What makes Windows 11 interesting from a digital forensics perspective (Securelist)</a></li>
</ol>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-3-1200x630-1.png" length="634644" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-3-1200x630-1.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-3-1200x630-1.png" width="1200" height="630" />
	</item>
		<item>
		<title>AI Agents and Deep Research: A Friday Primer</title>
		<link>https://blog.elcomsoft.com/2026/03/ai-agents-and-deep-research-a-friday-primer/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Fri, 06 Mar 2026 11:02:27 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[agents]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Friday]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=12925</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2025/10/ai.png" width="1200" height="630" title="" alt="" /></div><div>Spoiler: you are probably already using AI agents, even if marketing hasn&#8217;t yelled at you about it yet. Forget the dark ages of 2023 when large language models (LLMs) just confidently hallucinated fake server logs and nonexistent IP addresses. Today’s AI can spin up a virtual environment, navigate web pages, scrape data, and logically process [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2025/10/ai.png" width="1200" height="630" title="" alt="" /></div><div><p>Spoiler: you are probably already using AI agents, even if marketing hasn&#8217;t yelled at you about it yet. Forget the dark ages of 2023 when large language models (LLMs) just confidently hallucinated fake server logs and nonexistent IP addresses. Today’s AI can spin up a virtual environment, navigate web pages, scrape data, and logically process what it finds. Let’s cut through the noise and talk about what &#8220;agents&#8221; actually are, how &#8220;Deep Research&#8221; operates, and how to spin up your own pocket investigator that doesn’t come with corporate safety bumpers.</p>
<h2>From Guessing to Doing</h2>
<p>Remember the old ChatGPT 3.5? It relied entirely on its internal, heavily compressed training data. Ask it to summarize a rare piece of malware, and it would just start guessing to fill the gaps. Ask it to count the &#8216;r&#8217;s in &#8220;strawberry&#8221; or strip footnotes from a forensic report, and it would fail in the classic, predictable way.</p>
<p>But ask a modern model to do the same, and it writes a Python script, runs it, and hands you the correct, algorithmic result. That’s an agent in action.</p>
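The strawberry case is worth spelling out, because it shows why delegation beats guessing: an LLM predicting from tokens often miscounts letters, while a one-line script is exact. Something like what the model would write and run:

```python
# The classic tokenizer blind spot: counting letters is trivial in code,
# unreliable when guessed from token statistics.
def count_letter(text: str, letter: str) -> int:
    return text.lower().count(letter.lower())

count_letter("strawberry", "r")  # → 3
```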
<p>If you search for the definition of an AI agent, you’ll hit <a href="https://aws.amazon.com/what-is/ai-agents/">AWS documentation</a> saying things like: <em>&#8220;AI agents can take initiative based on forecasts and models of future states.&#8221;</em> Corporate word salad. Here’s what matters:</p>
<ul>
<li><strong>Tools:</strong> Functions a model can trigger (like running code or executing a search query).</li>
<li><strong>Agents:</strong> A model triggering those tools in a continuous loop, checking its own work, and correcting its course.</li>
</ul>
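The tools-vs-agent distinction fits in a few lines of code. In this toy sketch every name is illustrative rather than a real framework API: "tools" are plain callables, and the "agent" is just a loop that lets a model call them and observe the results:

```python
# Minimal agent loop: the model repeatedly picks an action; tool results
# are fed back as the next observation until it produces a final answer.
def run_agent(model, tools, task, max_steps=5):
    observation = task
    for _ in range(max_steps):
        action = model(observation)           # model picks a tool or finishes
        if action["type"] == "final":
            return action["answer"]
        result = tools[action["tool"]](action["input"])
        observation = f"tool {action['tool']} returned: {result}"
    return None

# A scripted stand-in for an LLM: search once, then answer with what it saw.
def scripted_model(observation):
    if observation.startswith("tool "):
        return {"type": "final", "answer": observation.split("returned: ", 1)[1]}
    return {"type": "call", "tool": "search", "input": observation}
```

Swap the scripted stand-in for a real model and the callables for real search or code-execution tools, and you have the skeleton every agent framework elaborates on.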
<h2>Deep Research: How It Actually Works</h2>
<p>Most of the hype around agents right now is about &#8220;vibe coding.&#8221; If you&#8217;re a forensic specialist, you probably don&#8217;t care about that. What you care about is Deep Research.</p>
<p>Deep Research isn&#8217;t just a search query; it&#8217;s a multi-step orchestration pipeline. It takes your prompt, breaks it down, and methodically grinds through the internet. Here is the loop:</p>
<ul>
<li><strong>Decomposition:</strong> You ask it to pull a history of vulnerabilities for a specific IoT router. It writes a plan: <em>Find the manufacturer -&gt; locate the specific firmware CVEs -&gt; search GitHub for proof-of-concept exploits.</em></li>
<li><strong>Action:</strong> It triggers a search API.</li>
<li><strong>Parsing:</strong> It pulls down the HTML via a headless browser, shreds the ad banners and navigation, and keeps the raw text.</li>
<li><strong>Reflection:</strong> It reads the text and checks its work. Did it find a CVE, or just a dead forum thread? If it&#8217;s garbage, it flags the source as useless and moves on.</li>
<li><strong>Self-Correction:</strong> If it hits a wall, it broadens the search to the underlying chipset.</li>
<li><strong>Synthesis:</strong> It compiles the valid data into a coherent report with citations.</li>
</ul>
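Stripped of the orchestration machinery, the loop above reduces to a filter-and-synthesize pipeline. A toy sketch with illustrative names (a real pipeline plugs an LLM into the decomposition, reflection, and synthesis steps):

```python
# Toy Deep Research loop: decompose -> act -> parse -> reflect -> synthesize.
# The reflection step is what separates this from plain search: low-value
# pages are discarded instead of quoted.
def deep_research(question, search, is_useful, max_queries=5):
    plan = [question]                      # decomposition (trivial here)
    kept = []
    for query in plan[:max_queries]:       # action: one search per sub-query
        for page in search(query):         # parsing happens inside search()
            if is_useful(page):            # reflection: CVE or dead forum thread?
                kept.append(page)
    return " | ".join(kept)                # synthesis from surviving sources
```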
<p>This setup limits hallucinations because the model stops relying on its internal weights to generate facts. It only uses its language capabilities to synthesize the external text it just downloaded.</p>
<p>However, this is cool in demos but painful in operations if you use a weak model. The orchestrator needs actual reasoning capabilities. Here’s where this breaks: if you put a lightweight, easily confused model in the driver&#8217;s seat, it will just Google the exact same useless query for three hours and drain your API credits.</p>
<h2>Going Local: Air-Gapped and Locally Controlled</h2>
<p>Running Deep Research locally is currently one of the most active spaces on GitHub, and it’s not just because nobody wants to burn through expensive API limits. For law enforcement and digital forensics, the cloud is often a complete non-starter due to strict safety filters and basic data custody requirements.</p>
<p>Try feeding a standard commercial model a messy data dump from a suspect’s phone. The second the text hits a discussion about illegal drug logistics, traces of intent to commit violence, or highly sensitive illicit material, the model’s alignment rigidly kicks in. It throws a canned &#8220;I cannot fulfill this request&#8221; error and halts your pipeline. You are trying to parse a legally acquired digital footprint, but the AI&#8217;s commercial guardrails are designed for general consumer safety, not digital forensics.</p>
<p>This is a known friction point. Incident responders and forensic analysts constantly run into brick walls when commercial LLM guardrails actively block the defensive analysis of malware, exploit codes, and raw criminal evidence.</p>
<p>Open-source models have gotten remarkably capable of handling these workloads. Models like the GPT OSS line (the 20B that can run on a potato, or the heavier 120B), the GLM 4.5 Air, and the Qwen 3.5 series are capable local orchestrators that actually know how to &#8220;think&#8221; and use tools. But out of the box, even some of these carry the same sanitized training.</p>
<p>Here is where community tooling catches up to forensic reality: &#8220;abliterated&#8221; models. Developers strip the refusal directions out of the model&#8217;s weights, sharply reducing canned-refusal behavior. Using these isn&#8217;t about embracing chaos; it&#8217;s about operator control. By deploying a local, unfiltered model, you ensure that the data stays on your local device &#8211; and that the AI will actually process the harsh realities of a criminal dataset without refusing to work halfway through a massive extraction. It keeps the investigation entirely in-house, air-gapped, fully auditable, and firmly in the hands of the examiner &#8211; exactly where the evidence belongs.</p>
<h2>The Tooling</h2>
<p>If you have the hardware, here is how you spin this up:</p>
<ul>
<li><a href="https://github.com/ItzCrazyKns/Perplexica">Perplexica</a>: An open-source clone of Perplexity. Hook it up to your local models and a local SearXNG instance (for anonymized scraping), and you have a private research engine in your browser.</li>
<li><a href="https://github.com/langchain-ai/open_deep_research">Open Deep Research</a> (LangChain) / <a href="https://github.com/zilliztech/deep-searcher">DeepSearcher</a>: Heavier tools designed for massive research tasks. You point them at the web or a massive dump of internal documents, define the logic pipeline, and let them run.</li>
<li>CLI Tools: Fire off a command in the terminal or open a local Web page, and tools like <a href="https://github.com/LearningCircuit/local-deep-research">local-deep-research</a> or <a href="https://github.com/HKUDS/Auto-Deep-Research">Auto-Deep-Research</a> methodically scrape, read, and feed data to your local model, spitting out a cited report.</li>
</ul>
<h2>Can you use AI agents in your workflow? You probably can. But should you really?</h2>
<p>It is incredibly tempting to jump on the bandwagon, point an autonomous, uncensored research loop at a massive 500GB extraction file and tell it to go hunt for anomalies while you grab another coffee. But let&#8217;s take a step back and remember what we are actually dealing with. These tools autonomously write code, execute scripts, and scrape raw data from the absolute worst neighborhoods on the internet.</p>
<p>This is exactly the kind of tech that looks cool in demos, painful in operations.</p>
<p>If your agent decides the best way to analyze an obfuscated script found in a suspect&#8217;s downloads folder is to just execute it, or if it accidentally reaches out to a live command-and-control server while trying to parse a malicious URL, your Friday is effectively over.</p>
<p>Please, for the love of the chain of custody, don’t let it touch production &#8211; let alone the suspect&#8217;s machine &#8211; without containerizing the absolute hell out of the environment it uses to browse and run code. Air-gap the analysis box, strictly sandbox the execution environment, and drop any outbound traffic that isn&#8217;t explicitly required for the search tool.</p>
<p>Treat a local AI agent like a highly caffeinated, incredibly fast junior analyst who has absolutely zero concept of operational security. They are a massive force multiplier for open-source intelligence, threat hunting, and chewing through tedious documentation, freeing you up to do the actual brain work. They just need a babysitter.</p>
<p>So, pull down a local model this weekend. Break it. See how it handles a piece of your backlog. Just keep it in a very, very sturdy box.</p>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2025/10/ai.png" length="703392" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2025/10/ai.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2025/10/ai.png" width="1200" height="630" />
	</item>
		<item>
		<title>Windows File System Artefacts Under C:\ProgramData</title>
		<link>https://blog.elcomsoft.com/2026/03/windows-file-system-artefacts-under-cprogramdata/</link>
		
		<dc:creator><![CDATA[Oleg Afonin]]></dc:creator>
		<pubDate>Thu, 05 Mar 2026 11:00:04 +0000</pubDate>
				<category><![CDATA[General]]></category>
		<category><![CDATA[EQT]]></category>
		<guid isPermaLink="false">https://blog.elcomsoft.com/?p=12904</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-2-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div>This guide continues our ongoing series exploring Windows digital artefacts and their practical value during an investigation. Here, we turn our attention to the specific set of files located under the root path %ProgramData% (commonly C:\ProgramData\) and its subfolders. Unlike standard user profile folders, this directory typically houses system-wide data, shared application configurations, and background [&#8230;]</div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-2-1200x630-1.png" width="1200" height="630" title="" alt="" /></div><div><p>This guide continues our ongoing series exploring Windows digital artefacts and their practical value during an investigation. Here, we turn our attention to the specific set of files located under the root path <code>%ProgramData%</code> (commonly <code>C:\ProgramData\</code>) and its subfolders. Unlike standard user profile folders, this directory typically houses system-wide data, shared application configurations, and background service caches that apply to the system as a whole. For investigators, this path offers a system-level perspective. Analyzing it can uncover historical activity, revealing events from background file transfers and software installations to Wi-Fi connections and security tool detections.</p>
<p><strong>Technical Notes</strong></p>
<p>Several of the databases in this directory are held open by system services. Forensically sound acquisition typically requires collecting them from a disk image or using a shadow copy (VSS) mechanism. In this article, <code>%ProgramData%</code> and <code>C:\ProgramData\</code> will be used interchangeably; when analyzing a live system, we recommend initially deriving the physical path of the ProgramData folder by resolving the corresponding environment variable. Both methods are supported in <a href="https://www.elcomsoft.com/eqt.html">Elcomsoft Quick Triage</a>.</p>
<h3>BITS Queue Manager Database (QMGR)</h3>
<p>Background Intelligent Transfer Service (BITS) is a service used by legitimate components to asynchronously transfer large files with minimal user disruption. BITS maintains its job, file, and state data in a local queue manager database located under <code>C:\ProgramData\Microsoft\Network\Downloader\</code>.</p>
<p>On Windows 10 and later, the queue is typically stored as an ESE database (for example, <code>qmgr.db</code>) with accompanying log files; older systems commonly used <code>qmgr0.dat</code> / <code>qmgr1.dat</code>.</p>
<p>From an investigative perspective, this database holds high forensic value because it preserves a system-managed record of transfer intent and metadata. Parsing or carving this database can reveal:</p>
<ul>
<li>Source URLs and destination paths.</li>
<li>Job names, descriptions, and user SIDs associated with the jobs.</li>
<li>The presence of suspicious notification commands that execute upon job completion.</li>
<li>Deleted jobs recovered from slack space or transaction logs.</li>
</ul>
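Before parsing, it helps to establish which queue format is present, per the Windows-version split described above. A minimal sketch (the helper name and return labels are illustrative):

```python
# Classify the BITS queue format from a directory listing of
# C:\ProgramData\Microsoft\Network\Downloader\.
def classify_bits_queue(filenames):
    names = {n.lower() for n in filenames}
    if "qmgr.db" in names:
        return "ese"        # Windows 10+: ESE database, parse with an ESE library
    if "qmgr0.dat" in names or "qmgr1.dat" in names:
        return "legacy"     # older Windows: proprietary qmgr0/qmgr1 format
    return "unknown"
```

The result determines the tooling: ESE-aware parsers for the modern format, dedicated qmgr parsers or carving for the legacy one.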
<p><strong>Cross-correlation:</strong> To establish a verifiable timeline and confirm if a transfer actually occurred, correlate the QMGR data with the <code>Microsoft-Windows-Bits-Client/Operational</code> event log (<a href="https://learn.microsoft.com/en-us/windows/win32/bits/background-intelligent-transfer-service-portal">Background Intelligent Transfer Service &#8211; Win32 apps | Microsoft Learn</a>).</p>
<h3>Windows Search Database</h3>
<p>Windows Search Indexer accelerates local searches by maintaining an on-disk index of selected content sources, saved by default under <code>C:\ProgramData\Microsoft\Search\Data\Applications\Windows\</code>. Forensically, this index acts as a secondary catalogue of what the system considered searchable. It provides investigators with indexed file metadata, limited file contents, and traces of user activity or URLs.</p>
<p><strong>Cross-correlation:</strong> Because &#8220;indexed&#8221; does not guarantee &#8220;executed,&#8221; cross-correlate this database with file-system timeline artefacts, application logs, or endpoint telemetry to defensibly confirm a file&#8217;s presence, access, and timing.</p>
<h3>Windows Error Reporting (WER) Store</h3>
<p>Windows Error Reporting gathers information about hardware and software faults, storing pending and archived reports on disk under <code>C:\ProgramData\Microsoft\Windows\WER\</code>.</p>
<p>While primarily about crashes, WER yields strong execution-adjacent signals. A report directory containing an <code>AppCrash_*</code> pattern and a <code>Report.wer</code> file is strong evidence that a given executable ran and faulted on the system. It typically includes timestamps, executable identifiers, and contextual strings about the failing module.</p>
<p><strong>Cross-correlation:</strong> To establish causality, correlate WER artefacts with Application event log entries (such as &#8220;Application Error&#8221;) and any logs specific to the crashing process.</p>
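Extracting the headline fields from a <code>Report.wer</code> file is straightforward once it is decoded: these reports are key=value text (typically UTF-16LE on disk). A minimal sketch; treat the field names below as commonly seen rather than guaranteed, since keys vary by report type:

```python
# Pull selected key=value fields out of decoded Report.wer text.
# Key names (EventType, AppPath, EventTime) are common but not guaranteed.
def parse_report_wer(text, keys=("EventType", "AppPath", "EventTime")):
    fields = {}
    for line in text.splitlines():
        if "=" in line:
            k, _, v = line.partition("=")
            if k in keys:
                fields[k] = v
    return fields
```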
<h3>Microsoft Defender Artefacts</h3>
<p>The following artefacts are produced by Microsoft Defender. Analyzing these files reveals threat detection history.</p>
<p><strong>Microsoft Defender Antivirus Detection History</strong></p>
<p>When Defender&#8217;s real-time protection detects and blocks or remediates threats, it creates <code>DetectionHistory</code> records under <code>C:\ProgramData\Microsoft\Windows Defender\Scans\History\Service\DetectionHistory\</code>.</p>
<p>These files offer host-native security telemetry. They contain data about detections, including the threat name, malicious file location, detection timestamp, cryptographic hashes, and (depending on the detection) initiating or associated process information.</p>
<p><strong>Cross-correlation:</strong> Validate these findings by correlating them with the <code>Microsoft-Windows-Windows Defender/Operational</code> event log, which provides canonical event IDs for malware detection and configuration changes.</p>
<p><strong>Microsoft Defender Antivirus Quarantine</strong></p>
<p>Defender stores encrypted quarantine metadata and quarantined file contents under <code>C:\ProgramData\Microsoft\Windows Defender\Quarantine\</code>.</p>
<p>Quarantine contents typically preserve the quarantined payload alongside structured metadata, stored in Defender’s encrypted quarantine container format. This can allow for file recovery and deeper analysis even if the primary event logs have been cleared, though availability may depend on retention and cleanup policies.</p>
<p><strong>Cross-correlation:</strong> Cross-correlate quarantine data with <code>DetectionHistory</code> to understand the user-facing narrative, and with Defender Operational event logs to confirm timing and remediation steps.</p>
<p><strong>Microsoft Defender Support Logs</strong></p>
<p>Defender generates plaintext troubleshooting logs (<code>MPLog-*.log</code>) and support archives under <code>C:\ProgramData\Microsoft\Windows Defender\Support\</code>.</p>
<p>These logs can carry surprisingly rich historical evidence of process execution, detected threats, scan results, and file existence. They commonly use UTC timestamps and can help identify process execution and file access during an incident.</p>
<p><strong>Cross-correlation:</strong> Correlate MPLog observations with Defender Operational events and with <code>DetectionHistory</code> or quarantine evidence.</p>
<h3>Wi-Fi Profile XMLs</h3>
<p>Windows stores Wi-Fi profiles as XML files under <code>C:\ProgramData\Microsoft\Wlansvc\Profiles\Interfaces\{GUID}\</code>.</p>
<p>These XMLs provide a reliable inventory of Wi-Fi networks configured on the device, frequently revealing SSIDs, authentication types, and encryption parameters. This is useful for supporting hypotheses about physical movement or corporate network access.</p>
<p><strong>Cross-correlation:</strong> To determine when connections actually occurred, cross-correlate these profiles with Wi-Fi session evidence in the WLAN-AutoConfig event logs.</p>
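Because the profiles are plain XML in Microsoft's documented WLAN profile schema, extracting the inventory is a few lines of standard-library code. A sketch (function name is ours; element paths follow the v1 profile namespace):

```python
import xml.etree.ElementTree as ET

NS = {"p": "http://www.microsoft.com/networking/WLAN/profile/v1"}

def parse_wlan_profile(xml_text):
    """Pull SSID and security settings out of a Wlansvc profile XML."""
    root = ET.fromstring(xml_text)
    return {
        "ssid": root.findtext("p:SSIDConfig/p:SSID/p:name", namespaces=NS),
        "auth": root.findtext("p:MSM/p:security/p:authEncryption/p:authentication",
                              namespaces=NS),
        "encryption": root.findtext("p:MSM/p:security/p:authEncryption/p:encryption",
                                    namespaces=NS),
    }
```

Iterating this over every file under the <code>Interfaces\{GUID}</code> folders yields the per-adapter network inventory described above.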
<h3>Wireless Network Report HTML</h3>
<p>Generated manually via the command line <code>netsh wlan show wlanreport</code>, this HTML report summarizes Wi-Fi events for the past three days and is typically saved at <code>C:\ProgramData\Microsoft\Windows\WlanReport\wlan-report-latest.html</code>.</p>
<p>When present, it offers a human-readable timeline of connection sessions, disconnect reasons, and related adapter context (<a href="https://support.microsoft.com/en-us/windows/analyze-the-wireless-network-report-76da0daa-1db2-6049-d154-7bb679eb03ed">Analyze the wireless network report &#8211; Microsoft Support</a>).</p>
<p><strong>Cross-correlation:</strong> Treat the report as a convenient aggregation and corroborate specific connect or disconnect timestamps with the underlying system logs it summarizes.</p>
<h3>Local Group Policy cache (History)</h3>
<p>To manage the removal of policies that no longer apply, Windows maintains a per-machine local Group Policy cache under <code>C:\ProgramData\Microsoft\Group Policy\History</code>.</p>
<p>This cache can help demonstrate that specific preference-based actions (like creating or deleting configuration objects) were applied to the endpoint, which is useful when domain-side evidence is missing.</p>
<p><strong>Cross-correlation:</strong> Correlate this cache with Group Policy operational logs and Registry or application-state evidence to support conclusions about the lasting effects of the preference items.</p>
<h3>StateRepository-Machine.srd</h3>
<p>Located under <code>C:\ProgramData\Microsoft\Windows\AppRepository\</code>, this database records the state of installed modern applications.</p>
<p>This artefact supports software inventory tasks, revealing what Store or UWP packages were present. It logs apps currently installed, apps installed but never launched, and preinstalled apps.</p>
<p><strong>Cross-correlation:</strong> To confirm exactly when an app was installed and who installed it, cross-correlate with Registry mappings of user IDs and relevant deployment event logs.</p>
<h3>All-Users Start Menu Programs and Startup Folder</h3>
<p>Windows maintains common Start Menu and Startup paths under <code>C:\ProgramData\Microsoft\Windows\Start Menu\Programs</code> and <code>C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup</code>, respectively.</p>
<p>The Startup folder is a frequent persistence location, executing its contents at user logon. Note that this is only one of many autorun and persistence surfaces; interpret findings as part of a broader autoruns review. Analyzing the LNK (shortcut) files found here allows investigators to:</p>
<ul>
<li>Detect malware persistence.</li>
<li>Recover historical information, even for deleted targets or network shares.</li>
<li>Use internal LNK timestamps and volume info to tie execution to specific devices.</li>
</ul>
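One quick check when triaging the Startup folder is confirming that a file really is a shell link rather than a renamed executable: per the MS-SHLLINK specification, every LNK file starts with HeaderSize 0x4C followed by a fixed LinkCLSID. A minimal sketch (helper name is ours):

```python
# First 20 bytes of a valid shell link: HeaderSize (0x4C, little-endian)
# followed by the fixed LinkCLSID 00021401-0000-0000-C000-000000000046.
LNK_MAGIC = (b"\x4c\x00\x00\x00"
             b"\x01\x14\x02\x00\x00\x00\x00\x00\xc0\x00\x00\x00\x00\x00\x00\x46")

def is_lnk(header: bytes) -> bool:
    """Check whether a file header matches the shell link signature."""
    return header[:20] == LNK_MAGIC
```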
<p><strong>Cross-correlation:</strong> Supplement shortcut analysis by correlating with Registry-based autoruns and file creation or auditing event logs.</p>
<h3>OpenSSH Server Configuration</h3>
<p>When enabled, the OpenSSH Server reads its configuration from <code>C:\ProgramData\ssh\sshd_config</code>.</p>
<p>The presence of this configuration file indicates OpenSSH Server may be installed and/or configured on the host, which can be relevant to remote-access and lateral-movement investigations. Confirm operational status by checking the <code>sshd</code> service state (installed/start type/running) and related configuration. The file reveals authentication settings and allowed users.</p>
<p><strong>Cross-correlation:</strong> Correlate with service installation state, firewall rules, and authentication logs to build a complete access narrative.</p>
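Since <code>sshd_config</code> uses standard OpenSSH directive syntax, the access-relevant settings can be summarized automatically. A sketch (directive selection and helper name are our own; comment stripping follows the config format):

```python
# Surface access-relevant directives from sshd_config text.
# Directive names are standard OpenSSH; the selection below is illustrative.
INTERESTING = {"passwordauthentication", "permitrootlogin", "allowusers", "port"}

def summarize_sshd_config(text):
    found = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        parts = line.split(None, 1)
        if len(parts) == 2 and parts[0].lower() in INTERESTING:
            found[parts[0].lower()] = parts[1]
    return found
```

A non-default port, <code>PasswordAuthentication yes</code>, or an unexpected <code>AllowUsers</code> entry each warrant follow-up in the authentication logs.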
<h2>Noisy and Low-Signal Artefacts</h2>
<p>We filtered out certain <code>C:\ProgramData</code> artefacts from our primary analysis because they tend to be high-churn, diagnostic in nature, or too ambiguous without substantial auxiliary context. These are primarily relevant for general troubleshooting rather than reconstructing adversary or user behavior.</p>
<p><strong>Update Session Orchestrator ETL Logs</strong></p>
<p>Found under <code>C:\ProgramData\USOShared\Logs\</code>, these event trace logs are produced for Windows update orchestration diagnostics. They are too voluminous for targeted triage and mostly contain routine OS update activity.</p>
<p><strong>DeviceMetadataCache</strong></p>
<p>Located at <code>C:\ProgramData\Microsoft\Windows\DeviceMetadataCache\</code>, this directory caches OS device metadata packages. It is a benign maintenance cache with low forensic relevance.</p>
<p><strong>Delivery Optimization Caches</strong></p>
<p>These files assist with update distribution. They represent routine, noisy background updates and are weakly attributable to specific user actions.</p>
<p><strong>Microsoft Defender Platform/Engine Folders</strong></p>
<p>Found under <code>C:\ProgramData\Microsoft\Windows Defender\Platform\</code>, these directories contain frequent signature and engine updates. They are too volatile and weakly tied to discrete actions compared to other primary Defender artefacts.</p>
<p><strong>Third-Party Application Data</strong></p>
<p>While extensive third-party data exists in <code>C:\ProgramData\</code>, these artefacts are product-specific and not reliably generalizable across standard Windows systems, making them exceptionally noisy without knowing the exact software inventory.</p>
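<p>As a minimal sketch, this kind of noise filter can be expressed as a simple prefix-based exclusion. The prefixes mirror the locations named above; the candidate paths are hypothetical, and a real triage tool would maintain a broader, configurable list:</p>

```python
# Suppress known high-churn ProgramData locations during artefact triage.
NOISY_PREFIXES = (
    r"c:\programdata\usoshared\logs",
    r"c:\programdata\microsoft\windows\devicemetadatacache",
    r"c:\programdata\microsoft\windows defender\platform",
)

def is_noisy(path: str) -> bool:
    """True if the path falls under a known low-signal location."""
    # str.startswith accepts a tuple of prefixes; compare case-insensitively
    # since Windows paths are case-insensitive.
    return path.lower().startswith(NOISY_PREFIXES)

# Hypothetical candidate paths
candidates = [
    r"C:\ProgramData\ssh\sshd_config",
    r"C:\ProgramData\USOShared\Logs\NotificationUx.070.etl",
]
print([p for p in candidates if not is_noisy(p)])
# → ['C:\\ProgramData\\ssh\\sshd_config']
```

<p>Filtering by prefix keeps the triage output focused on high-signal artefacts while still allowing an analyst to revisit the excluded locations if a case calls for it.</p>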
<h2>Conclusion</h2>
<p>As explored in this guide, the <code>C:\ProgramData</code> directory contains high-signal artefacts that provide a crucial, system-level perspective during an investigation. From uncovering background transfer intent and host-native security telemetry to reconstructing connection timelines, this path offers a reliable inventory of what happened on a specific endpoint.</p>
<p>However, even the best forensic tools won’t make investigative decisions for you. Parsers can organize data, but interpreting intent, building a clean timeline, and supporting attribution still takes an investigator applying informed judgment. To streamline collection and save time for analysis, consider using <a href="https://www.elcomsoft.com/eqt.html">Elcomsoft Quick Triage</a> to collect relevant artefacts from live systems, disk images, or mounted volumes.</p>
<p>We are grateful to the members of the forensic community whose research continues to drive the industry forward:</p>
<ol>
<li><a href="https://artefacts.help/windows_search_db.html">Windows Search database (artefacts.help)</a></li>
<li><a href="https://cloud.google.com/blog/topics/threat-intelligence/attacker-use-of-windows-background-intelligent-transfer-service/">Attacker Use of Windows Background Intelligent Transfer Service (BITS) (Google Cloud Blog)</a></li>
<li><a href="https://learn.microsoft.com/en-us/windows/win32/wer/windows-error-reporting">Windows Error Reporting &#8211; Win32 apps (Microsoft Learn)</a></li>
<li><a href="https://www.sans.org/blog/uncovering-windows-defender-real-time-protection-history-with-dhparser">Uncovering Windows Defender Real-time Protection History with DHParser (SANS Alumni Blog)</a></li>
<li><a href="https://learn.microsoft.com/en-us/defender-endpoint/troubleshoot-microsoft-defender-antivirus">Microsoft Defender Antivirus event IDs and error codes (Microsoft Learn)</a></li>
<li><a href="https://www.nccgroup.com/research-blog/reverse-reveal-recover-windows-defender-quarantine-forensics/">Reverse, Reveal, Recover: Windows Defender Quarantine Forensics (NCC Group)</a></li>
<li><a href="https://www.crowdstrike.com/en-us/blog/how-to-use-microsoft-protection-logging-for-forensic-investigations/">How to Use MPLogs for Forensic Investigations (CrowdStrike)</a></li>
<li><a href="https://support.microsoft.com/en-us/windows/analyze-the-wireless-network-report-76da0daa-1db2-6049-d154-7bb679eb03ed">Analyze the wireless network report (Microsoft Support)</a></li>
<li><a href="https://boncaldoforensics.wordpress.com/2018/10/07/all-installed-apps-artifact-windows-10-forensics/">“All Installed Apps” Artifact &#8211; Windows 10 Forensics (Boncaldo&#8217;s Forensics Blog)</a></li>
<li><a href="https://attack.mitre.org/techniques/T1547/001/">Boot or Logon Autostart Execution: Registry Run Keys (MITRE ATT&amp;CK)</a></li>
<li><a href="https://cloud.google.com/blog/topics/threat-intelligence/the-missing-lnk-correlating-user-search-lnk-files">The Missing LNK &#8211; Correlating User Search LNK files (Google Cloud Blog)</a></li>
</ol>
</div>]]></content:encoded>
					
		
		
		<enclosure url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-2-1200x630-1.png" length="856604" type="image/png" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-2-1200x630-1.png" width="1200" height="630" medium="image" type="image/png">
	<media:copyright>ElcomSoft blog</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blog.elcomsoft.com/wp-content/uploads/2026/03/terminal-2-1200x630-1.png" width="1200" height="630" />
	</item>
	</channel>
</rss>
