est.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
est.social is meant to be a general-use Mastodon server for Estonia.

Administrator:

Server statistics:

87
active users

#LLM

61 posts · 43 participants · 10 posts today
Osma Suominen<p>Indeed, the CPU-only performance is even worse. The LocalScore on the tiny 1B model is only 16, with a text generation speed of 7.7 tokens/second.</p><p><a href="https://www.localscore.ai/result/235" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">localscore.ai/result/235</span><span class="invisible"></span></a></p><p>Let's see if I can run this on a Raspberry Pi for comparison...</p><p><a href="https://sigmoid.social/tags/LocalScore" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LocalScore</span></a> <a href="https://sigmoid.social/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://sigmoid.social/tags/benchmark" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>benchmark</span></a> <a href="https://sigmoid.social/tags/LocalLlama" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LocalLlama</span></a></p>
Osma Suominen<p>My hobby: running LocalScore.ai to benchmark how fast (ehm) my 2018 laptop runs a tiny 1B LLM. The laptop has a NVIDIA MX150 mobile GPU, 2GB VRAM. I guess it was intended for Photoshop filters or CAD stuff.</p><p>I got a LocalScore of 101 on the tiny model using the GPU (13.5 tokens/second for generation). A value of around 1000 is considered passable.</p><p><a href="https://www.localscore.ai/accelerator/234" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">localscore.ai/accelerator/234</span><span class="invisible"></span></a></p><p>Still, I think it's marginally better than CPU-only on the same laptop.</p><p><a href="https://sigmoid.social/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://sigmoid.social/tags/benchmark" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>benchmark</span></a> <a href="https://sigmoid.social/tags/LocalLlama" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LocalLlama</span></a> <a href="https://sigmoid.social/tags/LocalScore" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LocalScore</span></a></p>
Antonio Lieto<p>Publication News: the paper "Eliciting metaknowledge in Large Language Models" by Fabio Longo, Miseal Mongiovì, Luana Bulla &amp; myself has been published in the journal Cognitive Systems Research (Elsevier). Link (50 days free access): <a href="https://authors.elsevier.com/a/1ktLp4xrDwcCg4" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">authors.elsevier.com/a/1ktLp4x</span><span class="invisible">rDwcCg4</span></a></p><p><a href="https://fediscience.org/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://fediscience.org/tags/generativeAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>generativeAI</span></a> <a href="https://fediscience.org/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> </p><p><span class="h-card" translate="no"><a href="https://a.gup.pe/u/academicchatter" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>academicchatter</span></a></span> <span class="h-card" translate="no"><a href="https://a.gup.pe/u/cognition" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>cognition</span></a></span></p>
*|FNAME|* 🇨🇦🇺🇦🇬🇱<p><span class="h-card" translate="no"><a href="https://mstdn.social/@JamesWNeal" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>JamesWNeal</span></a></span> <br>Yes, this!</p><p>The <a href="https://infosec.exchange/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a>/ <a href="https://infosec.exchange/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> gold rush isn’t about who has the best tech (the Chinese model demonstrated this), it’s about who can soak up the most data. </p><p>These techbro jabronis believe that if only their model can just suck up enough data it’ll somehow become sentient. 🥴</p>
Kevin Karhan :verified:<p>The same people who think <a href="https://infosec.space/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> can become intelligent on their own also think that letting an elementary school child have unlimited access to Wikipedia is a valid replacement for academic training and [under-]grad studies!</p><p><a href="https://partyon.xyz/@nullagent/114276196358744510" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">partyon.xyz/@nullagent/1142761</span><span class="invisible">96358744510</span></a></p>
⚯ Michel de Cryptadamus ⚯<p>Really feels like an LLM got the idea of importing and exporting data mixed up with the concept of data about the import and export of goods when it was writing this guidance on tariff calculations...</p><p>&gt; "Parameter Selection: To calculate reciprocal tariffs, import and export data from the U.S. Census Bureau for 2024."</p><p>Seems like it should read "To calculate reciprocal tariffs, import import and export data..." but the LLM tokenizer didn't like using the word "import" twice in a row.</p><p><a href="https://universeodon.com/tags/uspol" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>uspol</span></a> <a href="https://universeodon.com/tags/tariffs" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>tariffs</span></a> <a href="https://universeodon.com/tags/Trump" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Trump</span></a> <a href="https://universeodon.com/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://universeodon.com/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a></p>
Marcel Waldvogel<p>I very often agree with Bruce Schneier. But not today.</p><p>If I wanted to make a private agreement through a digital trusted third party, why would I need an LLM?</p><p>The examples include comparing salaries. Instead of setting up (and later securely deleting) an LLM, we could just as easily run a function boiling down to<br>`return a &gt; b;`</p><p>No need to involve LLMs with their uncertainty or possibility to do prompt injection.<br><a href="https://waldvogel.family/tags/BruceSchneier" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>BruceSchneier</span></a> <a href="https://waldvogel.family/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://waldvogel.family/tags/TTP" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>TTP</span></a> <br><a href="https://www.schneier.com/blog/archives/2025/03/ais-as-trusted-third-parties.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">schneier.com/blog/archives/202</span><span class="invisible">5/03/ais-as-trusted-third-parties.html</span></a></p>
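The post's point can be made concrete: the "trusted third party" for a salary comparison reduces to a tiny deterministic function, not an LLM. A minimal sketch (function and return strings are illustrative, not from any referenced system):

```python
def compare_salaries(a: int, b: int) -> str:
    """Act as a trivial trusted third party: each side submits a salary,
    and the only output is which one is higher -- nothing else is revealed,
    and there is no model to prompt-inject."""
    if a > b:
        return "first is higher"
    if b > a:
        return "second is higher"
    return "equal"
```

Unlike an LLM, this function is deterministic, auditable in one line, and leaves nothing sensitive behind to delete.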
🏴󠁧󠁢󠁷󠁬󠁳󠁿 rhys 🏴󠁧󠁢󠁷󠁬󠁳󠁿<p>Everyone out here trying to figure out how the <a href="https://mastodon.rhys.wtf/tags/Trump" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Trump</span></a> administration derived its insane methodology for calculating tariff rates and the likely answer is much simpler and much stupider than anyone imagined.</p><p><a href="https://mastodon.rhys.wtf/tags/tariffs" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>tariffs</span></a> <a href="https://mastodon.rhys.wtf/tags/ai" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ai</span></a> <a href="https://mastodon.rhys.wtf/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://mastodon.rhys.wtf/tags/ChatGPT" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ChatGPT</span></a> <a href="https://mastodon.rhys.wtf/tags/USPolitics" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>USPolitics</span></a> <a href="https://mastodon.rhys.wtf/tags/USPol" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>USPol</span></a></p>
Holle Meding<p>📚 Extracting Citations with LLMs</p><p>At the <a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> for HPSS workshop, <span class="h-card" translate="no"><a href="https://sciences.social/@cmboulanger" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>cmboulanger</span></a></span> David Carreto Fidalgo &amp; Andreas Wagner presented LLaMore: a Python tool for extracting citation data from unstructured legal &amp; humanities texts using <a href="https://mastodon.social/tags/LLMs" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLMs</span></a> </p><p>Unlike GROBID, LLaMore handles complex footnotes and free-form references. Early results with GPT-4o and Llama 3.3 show significantly higher accuracy when benchmarked against a new gold standard TEI-annotated dataset.</p><p><a href="https://mastodon.social/tags/TEI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>TEI</span></a> <a href="https://mastodon.social/tags/openscience" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>openscience</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@maxplanckgesellschaft" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>maxplanckgesellschaft</span></a></span></p>
switching.software<p>This is how to disable the new “AI” chatbot in <a href="https://fedifreu.de/tags/Firefox" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Firefox</span></a>:</p><ul><li>type <code>about:config</code> into the awesome bar</li><li>skip the warning for first time modders</li><li>locate the <code>browser.ml.chat.enabled</code> setting and set it to <code>false</code></li></ul><p>In the <a href="https://fedifreu.de/tags/Librewolf" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Librewolf</span></a> fork, a thoughtful person has already done this for you.</p><p>(HT to <span class="h-card" translate="no"><a href="https://social.tchncs.de/@kuketzblog" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>kuketzblog</span></a></span> for the hint!)</p><p><a href="https://fedifreu.de/tags/genAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>genAI</span></a> <a href="https://fedifreu.de/tags/chatbot" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>chatbot</span></a> <a href="https://fedifreu.de/tags/bullshitgenerator" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>bullshitgenerator</span></a> <a href="https://fedifreu.de/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://fedifreu.de/tags/dontCallItAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>dontCallItAI</span></a></p>
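For those who prefer a file-based setting, the same preference the about:config steps above toggle can be pinned in a `user.js` file in the Firefox profile directory (this is the standard Firefox prefs syntax; the preference name is taken from the post):

```javascript
// Disable the built-in AI chatbot sidebar; applied on every startup.
user_pref("browser.ml.chat.enabled", false);
```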
Androcat<p>If you understand Virtue Epistemology (VE), you cannot accept any LLM output as "information".</p><p>VE is an attempt to correct the various omniscience-problems inherent in classical epistemologies, which all to some extent require a person to know what the Truth is in order to evaluate if some statement is true.</p><p>VE prescribes that we should look to how the information was obtained, particularly in two ways:<br>1) Was the information obtained using a well-known method that is known to produce good results?<br>2) Does the method appear to have been applied correctly in this particular case?</p><p>LLM output always fails on point 1. An LLM will not look for the truth. It will just look for what is a probable combination of words. This means that an LLM is just as likely to combine a number of true statements in a way that is probable but false, as it is to combine them in a way that is probable and true. </p><p>LLMs only sample the probability of word combinations. They don't understand the input, and they don't understand their own output.</p><p>Only a damned fool would use it for anything, ever.</p><p><a href="https://toot.cat/tags/epistemology" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>epistemology</span></a> <a href="https://toot.cat/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://toot.cat/tags/generativeAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>generativeAI</span></a> <a href="https://toot.cat/tags/ArtificialIntelligence" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ArtificialIntelligence</span></a> <a href="https://toot.cat/tags/ArtificialStupidity" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ArtificialStupidity</span></a> <span class="h-card" translate="no"><a href="https://a.gup.pe/u/philosophy" class="u-url mention" 
rel="nofollow noopener noreferrer" target="_blank">@<span>philosophy</span></a></span></p>
Pavel A. Samsonov<p>One fun thing about The Industry is that the same people are writing "AI is bad" and "here's how to use AI" content.</p><p>What's likelier: that you happened upon the one secret ethical and effective AI use case, or that it's easier to ignore harms when you're the one doing them?</p><p><a href="https://mastodon.social/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.social/tags/GenAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>GenAI</span></a></p>
AJ Sadauskas<p><span class="h-card" translate="no"><a href="https://aus.social/@skribe" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>skribe</span></a></span> Conversely, the cost of printing, distribution, and storage puts up a barrier to spamming people on other continents with mass quantities of low value slop. </p><p>Just think through the logistics of a hostile Eurasian state sending a mass quantity of printed materials to Australia or North America.</p><p>Or, for that matter, a hostile North American state sending a mass quantity of printed materials to Europe or Asia.</p><p>You would either need:–</p><p>a) At least one printing press on each continent;<br>b) You could try shipping the magazines, but they'd be a month out of date when they arrive; or<br>c) You could try flying them overseas, but that would be very expensive very quickly.</p><p>That's before you worry about things like delivery drivers (or postage), and warehouses.</p><p>These are less of an issue for books than they are for newspapers or magazines.</p><p>And if a particular newspaper or magazine is known to be reliable, written by humans, researched offline, and the articles are not available online, then there's potentially value in people buying a physical copy.</p><p><a href="https://social.vivaldi.net/tags/ChatGPT" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ChatGPT</span></a> <a href="https://social.vivaldi.net/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://social.vivaldi.net/tags/LargeLanguageModel" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LargeLanguageModel</span></a> <a href="https://social.vivaldi.net/tags/LargeLanguageModels" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LargeLanguageModels</span></a> <a href="https://social.vivaldi.net/tags/AI" class="mention 
hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://social.vivaldi.net/tags/ArtificialIntelligence" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ArtificialIntelligence</span></a> <a href="https://social.vivaldi.net/tags/GenAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>GenAI</span></a> <a href="https://social.vivaldi.net/tags/spam" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>spam</span></a> <a href="https://social.vivaldi.net/tags/news" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>news</span></a> <a href="https://social.vivaldi.net/tags/politics" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>politics</span></a> <a href="https://social.vivaldi.net/tags/business" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>business</span></a> <a href="https://social.vivaldi.net/tags/media" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>media</span></a> <a href="https://social.vivaldi.net/tags/meta" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>meta</span></a> <a href="https://social.vivaldi.net/tags/Facebook" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Facebook</span></a> <a href="https://social.vivaldi.net/tags/Google" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Google</span></a> <a href="https://social.vivaldi.net/tags/Gemini" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Gemini</span></a></p>
Alvin Ashcraft 🐿️<p>GitHub for Beginners: How to get LLMs to do what you want.</p><p><a href="https://github.blog/ai-and-ml/github-copilot/github-for-beginners-how-to-get-llms-to-do-what-you-want/" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">github.blog/ai-and-ml/github-c</span><span class="invisible">opilot/github-for-beginners-how-to-get-llms-to-do-what-you-want/</span></a></p><p><a href="https://hachyderm.io/tags/ai" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ai</span></a> <a href="https://hachyderm.io/tags/github" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>github</span></a> <a href="https://hachyderm.io/tags/learning" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>learning</span></a> <a href="https://hachyderm.io/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://hachyderm.io/tags/aimodels" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>aimodels</span></a></p>
Pavel A. Samsonov<p>In ancient times, the japes of fools and jesters were heeded as warnings from the gods. I have not spent this much time burnishing my jester credentials for nothing -- dashbots are coming and they will ruin everything. <a href="https://mastodon.social/tags/UXDesign" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>UXDesign</span></a> <a href="https://mastodon.social/tags/UX" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>UX</span></a> <a href="https://mastodon.social/tags/ProductManagement" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ProductManagement</span></a> <a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.social/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://mastodon.social/tags/GenAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>GenAI</span></a> <a href="https://mastodon.social/tags/B2B" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>B2B</span></a><br> <br><a href="https://spavel.medium.com/dashbots-the-inevitable-fusion-of-dashboards-and-chatbots-4de4a64d1f5f" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">spavel.medium.com/dashbots-the</span><span class="invisible">-inevitable-fusion-of-dashboards-and-chatbots-4de4a64d1f5f</span></a></p>
Not A Convicted Felon<p>You know, we invented systems before there were computers. <br>'Forms' were on paper, rather than on screens.<br>An 'in tray' was an actual metal wire, or wooden tray, for paper letters, notes, memos and forms.<br>A database was called a 'filing cabinet'.<br>An 'interface' was a mail box.<br>A 'front end' was a person, with a job title like administrator, or clerk.<br>These systems were described, in excruciating detail, in procedure manuals.<br>The processes were run not by CPUs, but by people.<br>'Bugs' were when people made mistakes.</p><p>Systems were difficult to understand, even harder to diagnose, and very very hard to fix or change.<br>To change the way a department worked, for e.g. accounts receivable was so hard that most companies never even tried.</p><p>And yet somehow people are under the impression that it is the code that is the difficult bit about modern business systems. <br>So they try and make the code part easier. <br><a href="https://hachyderm.io/tags/LowCode" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LowCode</span></a> <a href="https://hachyderm.io/tags/LoCode" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LoCode</span></a> <a href="https://hachyderm.io/tags/NoCode" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>NoCode</span></a> <a href="https://hachyderm.io/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://hachyderm.io/tags/GenAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>GenAI</span></a> <a href="https://hachyderm.io/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> </p><p>It was never the code. 
Code was never the bottleneck.</p><p><a href="https://raganwald.com/2012/01/08/duck-programming.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">raganwald.com/2012/01/08/duck-</span><span class="invisible">programming.html</span></a></p>
Pavel A. Samsonov<p>"If all you do is automate something which is lousy, you’re going to get a lousy result" <a href="https://mastodon.social/tags/ai" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ai</span></a> <a href="https://mastodon.social/tags/genai" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>genai</span></a> <a href="https://mastodon.social/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://mitsloan.mit.edu/ideas-made-to-matter/lure-so-so-technology-and-how-to-avoid-it" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">mitsloan.mit.edu/ideas-made-to</span><span class="invisible">-matter/lure-so-so-technology-and-how-to-avoid-it</span></a></p>
Simon 🐮<p>I created a handy <a href="https://cloudisland.nz/tags/flowchart" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>flowchart</span></a> for <a href="https://cloudisland.nz/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> &amp; <a href="https://cloudisland.nz/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> usage</p>
Florian Ledermann<p>I've been toying a bit with using LLMs for generating solutions for simple spatial analysis tasks (clustering etc.). </p><p>Oh boy! 😔 This stuff will *cost lives* as it will inevitably be used in real-world applications in contexts that are not robust to bullshitting.</p><p>Not to speak of the countless hours wasted by experienced folks to fix the stuff junior "developers" come up with.</p><p>Productivity boost my a**. This will cause maintenance/debugging/cleanup work of unprecedented scale.</p><p><a href="https://mapstodon.space/tags/llm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>llm</span></a> <a href="https://mapstodon.space/tags/chatgpt" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>chatgpt</span></a></p>
David :SetouchiExplorer:<p>I'm seeing more and more (including in the Fediverse!!!) people saying "I asked ChatGPT" instead of "I googled it" and I find this terrifying.</p><p>More than anything else, this makes me say "we're all doomed."</p><p><a href="https://setouchi.social/tags/ChatGPT" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ChatGPT</span></a> <a href="https://setouchi.social/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://setouchi.social/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a></p>