{"id":244,"date":"2025-09-10T17:02:04","date_gmt":"2025-09-10T17:02:04","guid":{"rendered":"https:\/\/farhan.marketingcie.com\/?p=244"},"modified":"2025-09-10T17:02:04","modified_gmt":"2025-09-10T17:02:04","slug":"grok-can-rewrite-wikipedia-to-remove-falsehoods-and-add-missing-context-elon-musk","status":"publish","type":"post","link":"https:\/\/farhan.marketingcie.com\/?p=244","title":{"rendered":"Grok can rewrite Wikipedia to remove falsehoods and add missing context: Elon Musk"},"content":{"rendered":"\n<p><a href=\"https:\/\/economictimes.indiatimes.com\/panache\/panache-people-101\/elon-musk\/profileshow\/79257472.cms\" target=\"_blank\" rel=\"noreferrer noopener\">Elon Musk<\/a>&nbsp;laid out plans to improve&nbsp;<a href=\"https:\/\/economictimes.indiatimes.com\/topic\/xai\" target=\"_blank\" rel=\"noreferrer noopener\">xAI<\/a>&nbsp;chatbot&nbsp;<a href=\"https:\/\/economictimes.indiatimes.com\/topic\/grok\" target=\"_blank\" rel=\"noreferrer noopener\">Grok<\/a>&nbsp;by cleaning up training data with synthetic corrections. 
In a recent episode of the All-In Podcast, Musk said Grok uses &#8220;a lot of inference compute and reasoning to look at all the source data&#8221; including&nbsp;<a href=\"https:\/\/economictimes.indiatimes.com\/topic\/wikipedia\" target=\"_blank\" rel=\"noreferrer noopener\">Wikipedia<\/a>, books, and websites.<\/p>\n\n\n\n<p>&#8220;So, take Wikipedia as an example\u2014though this applies to books, PDFs, websites, every form of information\u2014the&nbsp;<a href=\"https:\/\/economictimes.indiatimes.com\/topic\/grok-model\" target=\"_blank\" rel=\"noreferrer noopener\">Grok model<\/a>&nbsp;uses heavy inference to look at a Wikipedia page and say, \u2018What is true, partially true, false, or missing in this page?\u2019 Now rewrite the page to remove the falsehoods, correct the half-truths, and add missing context,\u201d Musk said, appearing on the tech podcast hosted by venture capitalists Chamath Palihapitiya, Jason Calacanis, David Sacks, and David Friedberg.<\/p>\n\n\n\n<p>When asked about the idea of publishing such cleaned-up knowledge for the world, Musk said, \u201cI&#8217;ll talk to the team about that, like a Grokipedia or whatever. It\u2019d be interesting.\u201d<\/p>\n\n\n\n<p>Musk&#8217;s tiff with Wikipedia goes back to 2023, when Jimmy Wales, founder of the online encyclopedia, said at Web Summit in Lisbon, &#8220;I\u2019m pretty happy that [large language models] are reading Wikipedia and not just Elon Musk\u2019s Twitter; it\u2019s not really a great source of truth.&#8221; Musk then called for his supporters to \u201cdefund Wikipedia\u201d and accused the platform of bias, calling it an \u201cextension of legacy media propaganda.\u201d<\/p>\n\n\n\n<p>Looking ahead, Musk said, \u201cWe might have AI smarter than any single human at anything as soon as next year. 
Probably by 2030, AI is smarter than the sum of all humans.\u201d<\/p>\n\n\n\n<p>The xAI chief also shared his view that AI is not \u201ca destination but part of the overall escalation of intelligence\u201d.<\/p>\n\n\n\n<p>Musk is betting on Grok coming out on top in the competitive AI race. His companies xAI and X earlier this month filed a sweeping US antitrust lawsuit against Apple and OpenAI, alleging the tech giants formed an illegal partnership to stifle competition in the AI and smartphone markets.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Elon Musk&nbsp;laid out plans to improve&nbsp;xAI&nbsp;chatbot&nbsp;Grok&nbsp;by cleaning up training data with synthetic corrections. In a recent episode of the All-In Podcast, Musk said Grok uses &#8220;a lot of inference compute and reasoning to look at all the source data&#8221; including&nbsp;Wikipedia, books, and websites. &#8220;So, take Wikipedia as an example\u2014though this applies to books, PDFs, websites, every form 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[],"class_list":["post-244","post","type-post","status-publish","format-standard","hentry","category-most-popular"],"_links":{"self":[{"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=\/wp\/v2\/posts\/244","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=244"}],"version-history":[{"count":1,"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=\/wp\/v2\/posts\/244\/revisions"}],"predecessor-version":[{"id":245,"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=\/wp\/v2\/posts\/244\/revisions\/245"}],"wp:attachment":[{"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=244"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=244"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/farhan.marketingcie.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=244"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}