Whole-known-network
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@shriramk" class="u-url mention">@<span>shriramk</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.nu/@richcarl" class="u-url mention">@<span>richcarl</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.social/@DRMacIver" class="u-url mention">@<span>DRMacIver</span></a></span> My impression is that the strong "glorified auto_" criticism comes as a Newtonian counterreaction to completely ridiculous claims, by people with blatant ulterior technocratic and economic motives, that these probabilistic models will offer us solutions to climate change and other global crises, and thereby justify their indiscriminate development and use at any cost. It's saying, "yes, these things are great, but they're just completing sentences/paragraphs/novels for you, and not an excuse to abdicate your critical thinking facilities"</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@film_girl" class="u-url mention">@<span>film_girl</span></a></span> <br />hmm i'd put linux on your 2018. My 2008 pro apple no longer supports updates so linux it is ... ๐ค</p>
<p><span class="h-card" translate="no"><a href="https://masto.ai/@jeromechoo" class="u-url mention">@<span>jeromechoo</span></a></span> itโs so true. She went from a Motorola Razr to an iPhone X. And whatโs wild is she was on her 3rd iPad by 2017 I think. She just took forever to get an iPhone.</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@shriramk" class="u-url mention">@<span>shriramk</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.nu/@richcarl" class="u-url mention">@<span>richcarl</span></a></span> I think this is only true if you assume your conclusion! In a pretty fundamental sense LLMs are doing the same thing as an auto complete, just much much better, so it's only better than any auto complete if you deliberately draw the boundary based on how good it is</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@film_girl" class="u-url mention">@<span>film_girl</span></a></span> a RAZR??? She was the coolest mom without even realizing it. I had to switch my mom from an Honor that broke every year.</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@shriramk" class="u-url mention">@<span>shriramk</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.nu/@richcarl" class="u-url mention">@<span>richcarl</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.social/@DRMacIver" class="u-url mention">@<span>DRMacIver</span></a></span> The coherent output that state-of-the-art transformer-based LLMs produce over global, long-range dependencies in data is a deliberate design feature of their architecture, and certainly blew everyone's expectations out of the water. It's still, however, an auto"complete" because these architectures still don't have any *intentionality* designed into them, and still function by producing *most probable* outputs.</p>
<p><span class="h-card" translate="no"><a href="https://mastodon.social/@film_girl" class="u-url mention">@<span>film_girl</span></a></span> What was the problem with the 2018 Air? Butterfly keyboard, or something else?</p>
<p>The 2018 retina MacBook Air had a fan. A loud fan. It's so loud just transferring data over Thunderbolt. God. What a piece of shit computer that we all desperately wanted to be good.</p>
<p><span class="h-card" translate="no"><a href="https://masto.ai/@jeromechoo" class="u-url mention">@<span>jeromechoo</span></a></span> since 2010. She went iPad in 2010 and then I got her to switch to Mac. Then she got an iPhone like YEARS later. She was on a Razr for forever.</p>