<p><span class="h-card" translate="no"><a href="https://ioc.exchange/@azonenberg" class="u-url mention">@<span>azonenberg</span></a></span> i def should build a design for this and then try and break it</p>
<p><span class="h-card" translate="no"><a href="https://ioc.exchange/@azonenberg" class="u-url mention">@<span>azonenberg</span></a></span> <span class="h-card" translate="no"><a href="https://chaos.social/@dlharmon" class="u-url mention">@<span>dlharmon</span></a></span> how did you know it&#39;s metastability specifically and not just jitter resulting in capturing the wrong thing sometimes (i.e. a normal race)?</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@whitequark" class="u-url mention">@<span>whitequark</span></a></span> Fair enough.</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@whitequark" class="u-url mention">@<span>whitequark</span></a></span> <span class="h-card" translate="no"><a href="https://chaos.social/@dlharmon" class="u-url mention">@<span>dlharmon</span></a></span> Yeah, modern nodes have very short windows, which makes catching metastability-related bugs tricky.</p><p>~5 years ago I caught one in a UART core I&#39;d been using for quite a long time: it would occasionally drop a byte when I was sending a lot of data. Turns out I wasn&#39;t synchronizing the input properly (i.e. at all).</p>
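<p>A minimal sketch of the fix being described: register the asynchronous serial input through two flip-flops in the UART&#39;s clock domain before the rest of the core samples it. This example uses Amaranth&#39;s FFSynchronizer from amaranth.lib.cdc; the module and signal names (UartRxInput, rx_pad, rx_sync) are illustrative, not taken from the original core.</p>
<pre><code>from amaranth import Module, Signal, Elaboratable
from amaranth.lib.cdc import FFSynchronizer

class UartRxInput(Elaboratable):
    def __init__(self):
        self.rx_pad  = Signal()  # asynchronous serial input straight from the pin
        self.rx_sync = Signal()  # synchronized copy for the UART state machine

    def elaborate(self, platform):
        m = Module()
        # Two cascaded flip-flops: the first stage may go metastable on a
        # badly timed edge, but it gets a full clock period to settle before
        # the second stage (and the rest of the core) ever sees the value.
        m.submodules.rx_cdc = FFSynchronizer(self.rx_pad, self.rx_sync)
        return m
</code></pre>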
<p><span class="h-card" translate="no"><a href="https://ioc.exchange/@azonenberg" class="u-url mention">@<span>azonenberg</span></a></span> i&#39;ve seen enough atrocious MCU RNG implementations that i don&#39;t even trust them to do that</p><p>what was the last thing to be broken, pico&#39;s PRNG? at least for the FPGA one, i know the ways in which it will be bad. the ST one is opaque</p>
<p><span class="h-card" translate="no"><a href="https://chaos.social/@dlharmon" class="u-url mention">@<span>dlharmon</span></a></span> <span class="h-card" translate="no"><a href="https://ioc.exchange/@azonenberg" class="u-url mention">@<span>azonenberg</span></a></span> re: metastability, i&#39;ve wondered about it too but it seems that on recent logic the window is absurdly small</p>
<p><span class="h-card" translate="no"><a href="https://chaos.social/@uliwitness" class="u-url mention">@<span>uliwitness</span></a></span> your most realistic bet is probably emulating an entire macOS in UTM.</p>
<p><span class="h-card" translate="no"><a href="https://chaos.social/@uliwitness" class="u-url mention">@<span>uliwitness</span></a></span> </p><p>&gt; Is there an Apple Silicon version of Parallels that will emulate an Intel CPU? </p><p>No, Parallels has never shipped an emulator.</p><p>Rosetta only emulates at the app level, so you may be able to run a Carbon app with it, but not an entire VM.</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@whitequark" class="u-url mention">@<span>whitequark</span></a></span> Fair enough, but ARM or Cadence or Synopsys or whoever they licensed the IP from probably didn&#39;t do the worst job.</p><p>Long term I want to build a randomness pool that&#39;s seeded by the TRNG over time and hashes itself to accumulate entropy even if the TRNG output is somewhat biased.</p>
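<p>A toy sketch of that idea (not the actual design): keep a hash-chained pool state, fold raw TRNG words into it over time, and derive outputs from the state, so bias in the TRNG mostly costs entropy rate rather than showing up directly in the output. The class and method names here are made up for illustration; a real implementation would more likely use an established construction such as HMAC-DRBG or Fortuna.</p>
<pre><code>import hashlib

class EntropyPool:
    """Hash-based entropy pool sketch: mix() folds raw TRNG output into a
    SHA-256 state over time, read() derives output and ratchets the state."""

    def __init__(self, seed: bytes = b""):
        self._state = hashlib.sha256(b"pool-init" + seed).digest()

    def mix(self, trng_bytes: bytes) -> None:
        # Fold new (possibly biased) TRNG output into the pool; the hash acts
        # as the extractor, so bias reduces the entropy gained per mix rather
        # than appearing directly in the pool output.
        self._state = hashlib.sha256(self._state + trng_bytes).digest()

    def read(self, n: int = 32) -> bytes:
        # Derive output from the current state, then ratchet the state forward
        # so earlier outputs cannot be reconstructed from a later state capture.
        out = hashlib.sha256(self._state + b"out").digest()
        self._state = hashlib.sha256(self._state + b"next").digest()
        return out[:n]
</code></pre>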