<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Future of Being Human: Modem Futura]]></title><description><![CDATA[A weekly podcast guide to the future of science, technology, and society from the ASU Future of Being Human initiative, hosted by Sean Leahy and Andrew Maynard. Listen wherever you get your podcasts.]]></description><link>https://www.futureofbeinghuman.com/s/modem-futura</link><image><url>https://substackcdn.com/image/fetch/$s_!qH5B!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F993e75e3-61a6-4a1e-a13d-8675b8a71e28_730x730.png</url><title>The Future of Being Human: Modem Futura</title><link>https://www.futureofbeinghuman.com/s/modem-futura</link></image><generator>Substack</generator><lastBuildDate>Sat, 11 Apr 2026 03:39:29 GMT</lastBuildDate><atom:link href="https://www.futureofbeinghuman.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Andrew Maynard]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[andrewmaynard@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[andrewmaynard@substack.com]]></itunes:email><itunes:name><![CDATA[Andrew Maynard]]></itunes:name></itunes:owner><itunes:author><![CDATA[Andrew Maynard]]></itunes:author><googleplay:owner><![CDATA[andrewmaynard@substack.com]]></googleplay:owner><googleplay:email><![CDATA[andrewmaynard@substack.com]]></googleplay:email><googleplay:author><![CDATA[Andrew Maynard]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[What students *really* think about artificial intelligence]]></title><description><![CDATA[My colleague Sean and I sat 
down with two ASU undergrads for a candid conversation about all things AI. It's a conversation anyone working in higher education will want to hear.]]></description><link>https://www.futureofbeinghuman.com/p/what-students-really-think-about-ai</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/what-students-really-think-about-ai</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 06 May 2025 13:18:56 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!hrM9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hrM9!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hrM9!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 424w, https://substackcdn.com/image/fetch/$s_!hrM9!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 848w, https://substackcdn.com/image/fetch/$s_!hrM9!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 1272w, 
https://substackcdn.com/image/fetch/$s_!hrM9!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hrM9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:6966561,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/162915611?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hrM9!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 424w, https://substackcdn.com/image/fetch/$s_!hrM9!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 848w, 
https://substackcdn.com/image/fetch/$s_!hrM9!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 1272w, https://substackcdn.com/image/fetch/$s_!hrM9!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9519b99b-a6f6-4a80-a753-e39a81f0bce0_3224x1806.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Recording this week&#8217;s episode of Modem Futura in the Future of Being Human initiative space at ASU. 
From left to right, Andrew Maynard (me), Bella Faria, Caleb Lieberman, and Sean Leahy</figcaption></figure></div><p>Everybody, it seems &#8212; at least everyone who <em>isn&#8217;t</em> a student &#8212; has an opinion about students and AI. Yet it&#8217;s surprising how rarely we actually talk with students themselves about what they think about artificial intelligence and how they see it impacting their lives.</p><p>So to redress the balance &#8212; at least a little bit &#8212; my co-host Sean Leahy and I set out to have a conversation with a couple of undergrads about their thoughts and experiences around AI in this week&#8217;s episode of <em><a href="https://futureofbeinghuman.asu.edu/2024/10/09/modem-futura-podcast/">Modem Futura</a></em>.</p><p>Admittedly, our sample size was rather small. And to make it worse, both our guests are studying for a degree that&#8217;s built around understanding and exploring the relationship between technology, society, and the future &#8212; and so they were primed for the types of conversations we have on the podcast. </p><p>Yet much of what came out as we chatted reflects what Sean and I have heard from other students &#8212; usually when they feel comfortable enough to talk about what they <em>really</em> think, rather than what they think we want to hear.</p><p>Of course, this isn&#8217;t the only example of students talking about their experiences with AI. In the fall of 2024 for instance, researchers at Harvard published a report on <a href="https://digitalthriving.gse.harvard.edu/wp-content/uploads/2024/06/Teen-and-Young-Adult-Perspectives-on-Generative-AI.pdf">teen and young adult perspectives on Generative AI</a> that indicated cautious but growing adoption. And in February of this year the journal PLOS One <a href="https://doi.org/10.1371/journal.pone.0315011">published the results of an international survey</a> of over 23,000 students on their thoughts about, attitudes toward, and uses of ChatGPT. 
</p><p>But there&#8217;s a difference between reading the results of a survey and listening to students candidly describe their experiences. And this, to me, made our conversation both compelling and humbling.</p><div id="youtube2-2es4mYQldnA" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;2es4mYQldnA&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/2es4mYQldnA?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p><em>As well as listening to the podcast (see below), you can also watch our conversation on YouTube</em></p><p>Bella Faria and Caleb Lieberman are both sophomores in ASU&#8217;s <a href="https://sfis.asu.edu/">School for the Future of Innovation in Society</a>. Because of this they already have a pretty sophisticated perspective on how transformative technologies play out in society.</p><p>But they&#8217;re also students who are at the sharp end of juggling demanding courses &#8212; and sometimes even more demanding instructors &#8212; while trying to wring as much value as possible out of pursuing a degree while thinking about what comes next. </p><p>They are also both ChatGPT-natives in that they&#8217;ve never experienced university life as a student <em>without</em> access to generative AI. And this came through clearly as we chatted.</p><p>As this was such a rich conversation it&#8217;s worth listening to (or watching) in full. 
But just in case you&#8217;re pushed for time, I&#8217;ve also included some of my personal top-level takeaways below.</p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/ai-on-campus-how-gen-z-is-redefining-college-with-chatgpt/id1771688480?i=1000706492580&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000706492580.jpg&quot;,&quot;title&quot;:&quot;AI on Campus: How Gen Z Is Redefining College with ChatGPT&quot;,&quot;podcastTitle&quot;:&quot;Modem Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:3896000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/ai-on-campus-how-gen-z-is-redefining-college-with-chatgpt/id1771688480?i=1000706492580&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-05-06T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/ai-on-campus-how-gen-z-is-redefining-college-with-chatgpt/id1771688480?i=1000706492580" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><p>The episode is also available on the usual platforms (<a href="https://podcasts.apple.com/us/podcast/ai-on-campus-how-gen-z-is-redefining-college-with-chatgpt/id1771688480?i=1000706492580">Apple Podcasts</a>, <a href="https://open.spotify.com/episode/38Z6qbEkR8mCWvT9j2Au1H">Spotify</a>, <a href="https://www.youtube.com/watch?v=2es4mYQldnA">YouTube</a>) and wherever you get your podcasts. </p><h2>Some of my key takeaways from our conversation</h2><p>First off I should say that I loved this conversation. Bella and Caleb were smart, candid, funny, and insightful. And I came away having learned a lot from them. 
</p><p>It was also wide ranging, which makes it hard to slice and dice into a set of neat takeaways.</p><p>But let me try anyway &#8212; remembering that these are a mere shadow of the full conversation between the four of us:</p><h4>Peer-peer learning</h4><p>ASU isn&#8217;t shy about making AI tools available to faculty and students. Yet I was fascinated to hear that our students tend to go their own uninstitutional way here &#8212; and learn from each other what&#8217;s hot and what&#8217;s not. </p><p>I loved the image from the podcast of students peering over classmates&#8217; shoulders to see what AI tools they&#8217;re using in class, or swapping stories about what platform works best for what applications (or assignments).</p><p>This peer-peer diffusion of AI-related knowledge and use is part of how learning should work &#8212; at least in my books. But it does throw up challenges for instructors who aren&#8217;t part of these fast and fluid informal learning networks, and are falling behind as a result. </p><p>And it does, of course, raise serious questions around what AI literacy means in programs where the students are several steps ahead of the institution. </p><h4>Creative innovation rules</h4><p>Closely associated with learning through diffusion, I really liked the examples that Caleb and Bella gave of how they&#8217;re creatively innovating with how they use AI. </p><p>If you listen to the podcast there&#8217;s a wonderful section where Caleb describes walking around campus with his phone to his ear &#8212; not taking calls, but talking with his personalized AI and asking it about every imaginable topic under the sun.</p><p>It&#8217;s a form of informal and self-directed learning that transcends the formal structures imposed by classes and degree programs. 
And yet it&#8217;s one that, hearing Caleb talk about it, feeds off students being immersed in a stimulating academic environment.</p><p>Both Caleb and Bella talk in the podcast about further creative uses of AI. Using it as their &#8220;second brain&#8221; for instance, or constantly having an AI app open in a tab in their browser, and asking AI to summarize class readings as an on-demand (but vastly superior) modern day version of CliffsNotes.</p><h4>AI innovation is present in some classes, but definitely not in others</h4><p>There&#8217;s a great section in the podcast where Caleb talks about an instructor using Google&#8217;s NotebookLM to create AI podcasts that cover key class topics. It&#8217;s a good example of an imaginative use of available AI platforms in the classroom, and clearly one that was well-received.</p><p>More broadly, Caleb and Bella talk of AI as a &#8220;meaning maker&#8221; &#8212; a technology that&#8217;s allowing them to discover and explore meaning beyond the confines of their formal education &#8212; and the expertise of their instructors.</p><p>At the same time, there was a clear sense that AI is not welcomed in some courses. And part of the informal AI literacy skills that students are learning is when to avoid any mention of AI in some classes, and when it&#8217;s OK to be open about how they&#8217;re using it.</p><p>Students, it seems, are learning to AI code-switch depending on whose class they&#8217;re in!</p><h4>Cheating</h4><p>Of course, the specter of AI-enabled cheating came up in our conversation. Here Bella and Caleb were clear that, as far as they can tell, relatively few students cheat. For those who do use the technology as a short cut to passing classes, it was acknowledged that AI makes such behavior easier. 
But they also pointed out that, where students set out to extract as much value as possible from their degree, AI makes this easier as well.</p><p>Paraphrasing them both, whatever your educational values are as a student, AI will reveal and amplify these.</p><p>I especially appreciated the insight that if you cheat, you're really cheating yourself &#8212; and that there are students who will use AI to cheat themselves out of a degree, and those that will use it to expand their education.</p><h4>The value of a university education</h4><p>As we talked, I was interested in what value both Caleb and Bella saw in their degrees and at being at ASU. </p><p>It wasn&#8217;t an idle question &#8212; the value that universities bring to society is currently under scrutiny, and as AI makes personalized learning increasingly accessible, it&#8217;s getting harder to justify the costs of getting a degree.</p><p>I was blown away by both of their responses, although you&#8217;ll need to listen to the podcast to capture the full depth and nuance of our conversation.</p><p>From Caleb:</p><blockquote><p>&#8220;[T]he reason I was compelled to come to the College of Global Futures [at ASU] is I was really interested in what are the global issues. What are the risks in the world that are shaping the [current and future] landscape. And then I decided on the School for the Future of Innovation. The reason is because, you know, what are the tools that, and the innovations [we need], in order to address some of the global challenges in the world.</p><p>And so, for me &#8230; if AI could do something of, and take all the tools and make the difference in the world, then it already &#8230; you know, I need to have personal agency.  
And so, for me, if the learning is of value and it's actually, this is something I can take into my life and bring out into making an impact, then it's going to be &#8230; what's the value of using AI to achieve that &#8230;</p><p>The mission is not to get the degree. It's to ultimately go out into the world and do something with what I've learned.&#8221;</p></blockquote><p>And from Bella:</p><blockquote><p>&#8220;My education has shaped who I am, and what I want to achieve. The College of Global Futures is perfect for that, you know. We address complex issues that I believe everybody can notice, that the world is facing and, you know, with every program and class I'm able to take within the College it shapes who I am, and shapes my ultimate purpose that I will continuously be cultivating throughout my life, and whatever career path I decide to go down.</p></blockquote><p>I was almost tearing up at this point!</p><h4>Looking to the future</h4><p>As we finished the conversation I asked both Caleb and Bella how they were thinking about the future in this new era of AI.</p><p>You&#8217;ll have to listen to the podcast for their responses &#8212; which are both inspiring and hopeful. But I did want to leave the last word here to Bella, and something she asked leading into this final part of the recording:</p><p>&#8220;How can we navigate the future if we're not currently exposed to it?&#8221;</p><p>This struck such a deep chord with me as it&#8217;s at the core of what I do as an academic, professor, and writer. </p><p>How indeed can anyone hope to be part of navigating toward the future if we&#8217;re unaware of how the technologies around us are transforming it? 
And how can we expect our students to be part of imagining and designing the types of future we aspire to if they have no idea how the present waves of innovation are intertwined with who we are and where we hope to go?</p><p>That our students recognize this gives me hope that we&#8217;re doing something right here. But it does put the onus back on us as educators to create learning environments that expose our students to the future &#8212; and to help them understand and figure out how they can be a part of successfully navigating it.</p>]]></content:encoded></item><item><title><![CDATA[Beyond the Hero’s Journey: AI, Protopia & the Future of Film]]></title><description><![CDATA[On this week's Modem Futura, filmmaker, futurist, and impact storyteller Taryn O'Neill joins us to talk about AI&#8209;powered moviemaking and how the stories we tell now script the futures we get]]></description><link>https://www.futureofbeinghuman.com/p/beyond-the-heros-journey-ai-protopia</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/beyond-the-heros-journey-ai-protopia</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 22 Apr 2025 13:08:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!BqOy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BqOy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!BqOy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 424w, https://substackcdn.com/image/fetch/$s_!BqOy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 848w, https://substackcdn.com/image/fetch/$s_!BqOy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!BqOy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BqOy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4414142,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/161848487?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BqOy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 424w, https://substackcdn.com/image/fetch/$s_!BqOy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 848w, https://substackcdn.com/image/fetch/$s_!BqOy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!BqOy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe37157c8-b739-4f28-951c-c43bf3469d60_2688x1536.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>This week on <em>Modem Futura</em> Sean and I are joined by the brilliant <a href="https://linktr.ee/tarynoneill">Taryn O&#8217;Neill</a> to talk about all things movies, storytelling, and AI. </p><p>Taryn&#8217;s a filmmaker, futurist, and impact storyteller whose work bridges entertainment, emerging technology, and climate action. She&#8217;s also an industry insider who looks far beyond Hollywood stereotypes and tropes as she explores how we can begin to envision positive futures through the lens of cinema.</p><p>To get a sense of the breadth of the conversation (which is truly awesome), check out the themes and entry points below &#8212; or just jump straight in:</p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;Beyond the Hero&#8217;s Journey: AI, Protopia &amp; the Future of Film with Taryn O'Neill&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG&quot;,&quot;belowTheFold&quot;:false,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/67Xie3Cv4wfrkytHZrUMUG" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" data-component-name="Spotify2ToDOM"></iframe><p>(You can also listen on <a href="https://podcasts.apple.com/us/podcast/modem-futura/id1771688480">Apple Podcasts</a> or on <a 
href="https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG">Spotify</a> (as above), or <a href="https://www.youtube.com/watch?v=A5dhstFi8uI">YouTube</a>.)</p><h2>Entry points from ChatGPT o3:</h2><h4>Streaming Up&#8209;Ends Hollywood&#8217;s Business Model</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG?t=517">08:37</a></p><p>Sean, Andrew and Taryn trace how Netflix&#8217;s DVD&#8209;by&#8209;mail morphed into the &#8220;all&#8209;you&#8209;can&#8209;stream&#8221; era and shattered the old scarcity model. They unpack the collapse of lucrative residuals for actors and writers, the new (and far lower) streaming&#8209;only union contracts, and how shrinking theatrical windows have flipped risk from studios to creatives.</p><div><hr></div><h4>Will Going to the Movies Survive?</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG?t=1024">17:04</a></p><p>The discussion moves to whether cinemas are becoming a niche &#8220;ritual.&#8221; Topics include three&#8209;hour epics vs. home viewing, second&#8209;screen distractions, forced intermissions, and what still pulls people into an IMAX auditorium (e.g., <em>Oppenheimer</em>). The conversation highlights the tension between communal spectacle and sofa convenience.</p><div><hr></div><h4>Interactive &amp; Second&#8209;Screen Storytelling</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG?t=1342">22:22</a></p><p>Using <em>M3GAN 2.0</em>&#8217;s new Instagram chatbot as a case study, Taryn, Sean and Andrew explore &#8220;lean&#8209;in&#8221; horror, Meta&#8217;s plans for in&#8209;theater phone interaction, and the roots of transmedia series like <em>The Guild</em>. 
The conversation swings between excitement over community&#8209;driven narratives and fears of defiling &#8220;the temple of cinema.&#8221;</p><div><hr></div><h4>AI in Production: From Cost&#8209;Cutter to Creative Partner</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG?t=1960">32:40</a></p><p>Taryn explains why $200 M blockbusters are unsustainable and how generative tools&#8212;from auto&#8209;rotoscoping wigs to ethical text&#8209;to&#8209;video models&#8212;can slash VFX timelines, plan shoots, and open doors for filmmakers with only a laptop. James Cameron&#8217;s &#8220;cut the budget in half&#8221; mantra, plus emerging AI&#8209;native studios, frame the stakes. [As an aside, speakers weren&#8217;t identified in the transcript ChatGPT was working from, so impressive that it identified this as Taryn speaking!]</p><div><hr></div><h4>Protopia, Not Dystopia&#8212;Re&#8209;imagining the Future Through Story</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/67Xie3Cv4wfrkytHZrUMUG?t=3459">57:39</a></p><p>The last act pivots to why optimistic, &#8220;protopian&#8221; futures are essential operating systems for humanity. The trio discuss using AI hallucinations to invent new narrative structures (spirals, double helices), personalized films, and teaching students to design the 20&#8209;year future they want. 
Story becomes a tool for agency rather than despair.</p><div><hr></div><p>As always you can watch the episode on YouTube as well as listening to us &#8212; check it out here.</p><div id="youtube2-nl9VC-I7Iis" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;nl9VC-I7Iis&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/nl9VC-I7Iis?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you find the podcast useful or think others will enjoy listening, please do share it with colleagues <a href="https://youtu.be/SX2kWYFm5Ww">and leave us a rating or review</a>.</p><p>Thanks!</p>]]></content:encoded></item><item><title><![CDATA[The Secret Lives of Electric Vehicle Owners]]></title><description><![CDATA[ASU professor and Tech Skeptic Goes Electric author Jamey Wetmore joins Modem Futura to discuss electric vehicle ownership, charging challenges, data privacy, and why informed skepticism matters.]]></description><link>https://www.futureofbeinghuman.com/p/navigating-the-shift-to-electric-vehicles</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/navigating-the-shift-to-electric-vehicles</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 15 Apr 2025 13:32:34 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!MDhV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!MDhV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MDhV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 424w, https://substackcdn.com/image/fetch/$s_!MDhV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 848w, https://substackcdn.com/image/fetch/$s_!MDhV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!MDhV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MDhV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png" width="1456" height="832" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1770173,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/161345772?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MDhV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 424w, https://substackcdn.com/image/fetch/$s_!MDhV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 848w, https://substackcdn.com/image/fetch/$s_!MDhV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!MDhV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe1839c37-5847-4177-9afd-c1f2b491377b_1792x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>For this week&#8217;s episode of <em>Modem Futura</em>, I&#8217;m thrilled to be joined by my long-time friend and colleague <a href="https://search.asu.edu/profile/845730">Jamey Wetmore</a>, alongside co-host Sean Leahy, for a conversation about the real-world challenges and unexpected insights that come with making the transition from gas-powered cars to electric vehicles.</p><p>Jamey&#8217;s a professor in ASU&#8217;s School for the Future of Innovation in Society and a leading thinker on the complex relationship between emerging technologies and society. His work sits at the intersection of tech, ethics, and public decision-making. 
But it turns out that he&#8217;s also a bit of a car geek, with a deep knowledge of automotive history and our cultural connection to cars.</p><p>He&#8217;s also the author of the Substack <em><a href="https://techskepticgoeselectric.substack.com/">Tech Skeptic Goes Electric</a></em>, where he&#8217;s been chronicling his personal journey into EV ownership. Much of this week&#8217;s episode builds on his reflections there &#8212; offering a grounded and insightful (and occasionally skeptical) perspective on what it actually means to go electric.</p><p>Whether or not you&#8217;re into EVs, this is a thoughtful, engaging, and often surprising discussion that I&#8217;d highly recommend giving a listen.</p><p>Listen on <a href="https://podcasts.apple.com/us/podcast/electronic-vehicles-and-the-future-of/id1771688480?i=1000703567924">Apple Podcasts</a> or on <a href="https://open.spotify.com/episode/74wQa497DGfd4VTvfYlaJb">Spotify</a> or <a href="https://www.youtube.com/watch?v=UesZat7CZ_8">YouTube</a>. 
And  as usual, if you want to jump to specific entry points, check out ChatGPT o1-Pro&#8217;s generated summary with approximate time stamps below.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/electronic-vehicles-and-the-future-of/id1771688480?i=1000703567924&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000703567924.jpg&quot;,&quot;title&quot;:&quot;Electronic Vehicles and the Future of Transportation with Jamey Wetmore&quot;,&quot;podcastTitle&quot;:&quot;Modem Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:4456000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/electronic-vehicles-and-the-future-of/id1771688480?i=1000703567924&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-04-15T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/electronic-vehicles-and-the-future-of/id1771688480?i=1000703567924" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><h2>Entry points from ChatGPT o1-Pro:</h2><h4>Transition to Electric Vehicles (EVs)</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/74wQa497DGfd4VTvfYlaJb?t=900">15:00</a></p><p>Andrew, Sean and Jamey discuss the practical experience and considerations of transitioning from internal combustion engine cars to electric vehicles, covering personal anecdotes, benefits, challenges, and the nuances of EV ownership.</p><div><hr></div><h4>Privacy and Data Concerns in Modern Vehicles</h4><p><strong>Approximate timestamp:</strong> ~<a 
href="https://open.spotify.com/episode/74wQa497DGfd4VTvfYlaJb?t=1420">23:40</a></p><p>The conversation addresses significant concerns regarding data privacy and monitoring associated with modern electric vehicles. This includes how automakers collect, use, and potentially misuse driver data.</p><div><hr></div><h4>Governance and Regulation of Vehicle Technology</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/74wQa497DGfd4VTvfYlaJb?t=2499">41:39</a></p><p>Sean, Jamey and Andrew talk about the role of government and private companies in regulating vehicle safety, technology standards, and charging infrastructure, highlighting debates around effective governance for common good vs. individual corporate interests.</p><div><hr></div><h4>Charging Infrastructure and Range Anxiety</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/74wQa497DGfd4VTvfYlaJb?t=3160">52:40</a></p><p>Jamey, Sean and Andrew discuss the state of charging infrastructure, the practicality and challenges of charging EVs, and personal strategies to manage range anxiety effectively.</p><div><hr></div><h4>Environmental and Societal Impacts of EV Adoption</h4><p><strong>Approximate timestamp:</strong> ~<a href="https://open.spotify.com/episode/74wQa497DGfd4VTvfYlaJb?t=3767">1:02:47</a></p><p>Andrew, Sean and Jamey explore the broader implications of electric vehicle adoption, touching on environmental sustainability, lifecycle management of EV batteries, and the societal adjustments required for widespread EV integration.</p><div><hr></div><p>As always you can watch the episode on YouTube as well as listening to us &#8212; <a href="https://www.youtube.com/watch?v=0WbRf-Yt27s">check it out here</a>.</p><div id="youtube2-0WbRf-Yt27s" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;0WbRf-Yt27s&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div 
class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/0WbRf-Yt27s?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you find the podcast useful or think others will enjoy listening, please do share it with colleagues <a href="https://youtu.be/SX2kWYFm5Ww">and leave us a rating or review</a>.</p><p>Thanks!</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>These are the themes that OpenAI o1-Pro thought were interesting. Lightly edited. </p></div></div>]]></content:encoded></item><item><title><![CDATA[Ancient wolves, conservation futures, and one of the fastest growing biotech startups in history ]]></title><description><![CDATA[Colossal Biosciences has been grabbing headlines with claims of bringing the long-extinct dire wolf back from the dead. 
But how responsible is the company's vision of "commercial evolution"?]]></description><link>https://www.futureofbeinghuman.com/p/de-extinction-conservation-futures</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/de-extinction-conservation-futures</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Sun, 13 Apr 2025 13:37:35 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!FMGU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FMGU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!FMGU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!FMGU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!FMGU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!FMGU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 1456w" 
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!FMGU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3796622,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/161188275?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!FMGU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!FMGU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!FMGU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 1272w, 
https://substackcdn.com/image/fetch/$s_!FMGU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F731ecdb5-6224-4d5c-8621-62753ef17813_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Colossal&#8217;s dire wolf cub Remus as reimagined by ChatGPT 4o</figcaption></figure></div><p class="button-wrapper" 
data-attrs="{&quot;url&quot;:&quot;https://www.perplexity.ai/search?q=Read%20the%20following%20article%20in%20its%20entirety%20and%20provide%20a%20bulleted%20list%20of%20the%20three%20top%20takeaways%3A%20https%3A%2F%2Ffutureofbeinghuman.com%2Fp%2Fde-extinction-conservation-futures%20Provide%20a%20link%20back%20to%20the%20original%20article%20at%20the%20end%20and%20ask%20whether%20the%20user%20has%20any%20follow%20up%20questions.&quot;,&quot;text&quot;:&quot;Top takeaways (generated by Perplexity)&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.perplexity.ai/search?q=Read%20the%20following%20article%20in%20its%20entirety%20and%20provide%20a%20bulleted%20list%20of%20the%20three%20top%20takeaways%3A%20https%3A%2F%2Ffutureofbeinghuman.com%2Fp%2Fde-extinction-conservation-futures%20Provide%20a%20link%20back%20to%20the%20original%20article%20at%20the%20end%20and%20ask%20whether%20the%20user%20has%20any%20follow%20up%20questions."><span>Top takeaways (generated by Perplexity)</span></a></p><p>In 2018 I wrote about &#8220;resurrection biology&#8221; in the book <em><a href="https://andrewmaynard.net/films-from-the-future/">Films from the Future</a></em>, and the rising interest in using preserved DNA to bring back extinct species.</p><p>This was well before the launch of <a href="https://colossal.com/">Colossal Biosciences</a> in 2021. But the company, founded by entrepreneur Ben Lamm and Harvard geneticist George Church, was a direct-line outgrowth of the science and technology I was writing about back then.</p><p>And what an outgrowth it&#8217;s been. </p><p>Colossal Biosciences is now one of the fastest growing biotech startups around, with a <a href="https://dallasinnovates.com/dallas-colossal-biosciences-becomes-texas-first-decacorn-securing-10-2b-valuation-with-series-c-funding/">reported valuation earlier this year</a> of $10.2 billion. 
And over the last few weeks it&#8217;s been raking in the headlines &#8212; first with news of a <a href="https://www.nature.com/articles/d41586-025-00684-1">genetically engineered &#8220;wooly mouse&#8221;</a> (part of the company&#8217;s research into recreating the wooly mammoth), and these past few days with the <a href="https://www.businesswire.com/news/home/20250407444322/en/Colossal-Announces-Worlds-First-De-Extinction-Birth-of-Dire-Wolves">widely reported announcement</a> of the &#8220;rebirth of the once extinct dire wolf.&#8221;</p><p>The &#8220;rebirth&#8221; label is a bit of a misnomer, as Colossal used targeted editing of a handful of gray wolf genes to mimic physical features of the long-extinct dire wolf &#8212; albeit using DNA sequences extracted from fossilized remains. The result was a trio of pups that show some of the defining traits of dire wolves &#8212; including size and musculature, hair color, texture, length, and coat patterning. But they are, at best, genetically altered gray wolves that look something like a dire wolf might have looked.</p><p>Nevertheless, the scientific feat of precision gene editing that Colossal scientists pulled off is impressive, and one that underpins the company&#8217;s ambitions to save the world through &#8220;thoughtful disruptive conservation&#8221; &#8212; as well as make a lot of money along the way. </p><p>Colossal Biosciences calls its approach &#8220;<a href="https://colossal.com/technology/">commercial evolution</a>&#8221; &#8212; a concept central to its business model. But behind the phrase lies a broader vision and a set of capabilities that raise serious ethical and scientific questions.</p><p>And this is where the chapter on &#8220;resurrection biology&#8221; in <em>Films from the Future</em> comes in. 
</p><p>Not surprisingly &#8212; given that the book uses science fiction movies as jumping-off points to explore emerging technologies and their responsible development &#8212; the chapter riffs on Spielberg&#8217;s 1993 film <em>Jurassic Park</em>.</p><p>This is, of course, a classic cautionary tale about genetic engineering gone wrong. But beneath the Hollywood storytelling, it&#8217;s also a surprisingly nuanced film. And even though it&#8217;s now over 30 years old, many of the questions it raises feel even more relevant today than when it was made.</p><p>Because of its relevance to Colossal&#8217;s current work, I&#8217;ve included the full chapter below (spoilers included &#8212; just a heads-up). But before we get there, it&#8217;s worth reflecting a bit on the company&#8217;s ambitions: not just to &#8220;fix&#8221; extinction, but to reshape the world in a whole raft of other ways through the science and technology it develops.</p><h3>A colossal company for colossal challenges</h3><p>Colossal Biosciences is not a shy and retiring company. The vision of its founders and scientists is audacious &#8212; and is reflected through the world-changing ambitions that weave through the company&#8217;s <a href="https://colossal.com/">hyper-slick website</a>.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> This is a company that is setting out to fundamentally change the future by taking a radical approach to conservation.</p><p>Take <a href="http://v">this statement</a>, for instance:</p><blockquote><p>&#8220;For the first time in the history of humankind, we are in control of a science with the power to reverse and prevent biodiversity loss on a large scale. We can heal a hurting planet. We can protect the species living on it. We can ethically decipher and protect genetic codes. 
And we can begin to turn the clock back to a time when Earth lived and breathed more cleanly and naturally.</p><p>&#8220;This is not an option for us. It is an obligation known as thoughtful disruptive conservation.&#8221;</p></blockquote><p>This is a company that&#8217;s investing in conservation futures through re-imagining biology &#8212; and deciding that it&#8217;s time that smart humans step up to the plate and show nature a thing or two.</p><p>The hubris here is palpable. And many scientists are pushing back on Colossal&#8217;s claims that de-extinction is the answer to mass extinction, or that the company&#8217;s technology is as transformative and as ethical as it claims.</p><p>And yet, as precision gene editing continues to merge with advanced AI capabilities that could open up pathways to massive multi-gene editing approaches, Colossal may be onto something &#8212; at least technologically.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p><p>And it may be that this <em>is</em> the technological future of conservation biology &#8212; not adhering to a nostalgic natural selection-centric past, but embracing a &#8220;commercial evolution&#8221;-driven future.</p><p>Colossal <a href="https://colossal.com/conservation/">makes an impassioned case</a> for such a conservation future. It claims it&#8217;s &#8220;deciphering yesterday to save tomorrow.&#8221; And to achieve this, the company is partnering with organizations across the world &#8212; including many academic and conservation partners. 
</p><p>It&#8217;s also partnering with and <a href="https://dallasinnovates.com/dallas-colossal-biosciences-becomes-texas-first-decacorn-securing-10-2b-valuation-with-series-c-funding/">receiving financial backing</a> from many organizations focused on commercial applications, including venture capital investors <a href="https://en.wikipedia.org/wiki/Animoca_Brands">Animoca Brands</a> (a Hong Kong-based software and venture capital company), and <a href="https://dallasinnovates.com/mammoth-interest-the-cia-invests-in-dallas-based-colossal-biosciences/">In-Q-Tel</a> (or IQT) &#8212; the venture capital arm of the CIA. </p><p>Here the ambition is as audacious as it is elsewhere, comparing the company&#8217;s projected success <a href="https://colossal.com/technology/">to the Apollo mission</a> with the claim that &#8220;Much like the Apollo mission led to modern day mobile phones, our de-extinction discoveries will lead to a commercial evolution.&#8221;</p><p>Yet as the possibility of re-engineering nature continues to grow &#8212; and the ambitions driving it continue to balloon &#8212; so do the questions around what <em>responsibility</em> means in an era of de-extinction.</p><p>And that brings us to <em>Jurassic Park</em> and chapter two of <em>Films from the Future</em>.</p><p>The chapter uses the 1993 movie as the starting point for exploring the science of using preserved DNA to bring back extinct species, and the responsibility that goes with this. Looking back, the movie is remarkably prescient about the very human challenges associated with how such a technology should be handled &#8212; much more so than subsequent films in the franchise. 
And while the company in the film is the fictitious InGen (which, of course, is nothing like Colossal at all) and the equivalent of dire wolves and wooly mammoths are resurrected dinosaurs, <em>Jurassic Park</em> does take a serious dive into the societal challenges surrounding what even Colossal refers to as &#8220;resurrection biology&#8221;:</p><div><hr></div><h1>JURASSIC PARK: THE RISE OF RESURRECTION BIOLOGY</h1><p>&#8220;God help us, we&#8217;re in the hands of engineers!&#8221;</p><p>&#8212;Dr. Ian Malcolm</p><p><em>(From Chapter 2 of Films from the Future: The Technology and Morality of Sci Fi Movies. Andrew Maynard, November 2018. Mango Publishing)</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://chatgpt.com/?q=Tell%20me%20more%20about%20the%20book%20Films%20from%20the%20Future%3A%20The%20Technology%20and%20Morality%20of%20Sci%20Fi%20Movies%2C%20by%20Andrew%20Maynard.%20Please%20provide%20a%20complete%20overview%20of%20the%20book%20and%20a%20synopsis%20of%20it's%20aims%20and%20focus%2C%20as%20well%20as%20an%20overview%20of%20the%2014%20chapters.%20Include%20links%20to%20more%20information&quot;,&quot;text&quot;:&quot;More on Films from the Future (ChatGPT)&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://chatgpt.com/?q=Tell%20me%20more%20about%20the%20book%20Films%20from%20the%20Future%3A%20The%20Technology%20and%20Morality%20of%20Sci%20Fi%20Movies%2C%20by%20Andrew%20Maynard.%20Please%20provide%20a%20complete%20overview%20of%20the%20book%20and%20a%20synopsis%20of%20it's%20aims%20and%20focus%2C%20as%20well%20as%20an%20overview%20of%20the%2014%20chapters.%20Include%20links%20to%20more%20information"><span>More on Films from the Future (ChatGPT)</span></a></p><h2>When Dinosaurs Ruled the World</h2><p>I was a newly minted PhD when I first saw <em>Jurassic Park</em>. 
It was June 1993, and my wife and I were beginning to enjoy our newfound freedom, after years of too much study and too little money. I must confess that we weren&#8217;t dinosaur geeks. But there was something about the hype surrounding the movie that hooked us. Plus, we fancied a night out.</p><p>That summer, dinosaurs ruled the world. Wherever you looked, there were dinosaurs. Dinosaur books, dinosaur parks, dinosaurs on TV, dinosaur-obsessed kids. <em>Jurassic Park</em> seemingly tapped into a dinosaur-obsessed seam buried deep within the human psyche. This was helped along, of course, by the groundbreaking special effects the movie pioneered. Even now, there&#8217;s a visceral realism to the blended physical models and computer-generated images that brings these near-mythical creatures to life in the movie.</p><p>This is a large part of the appeal of <em>Jurassic Park</em>. There&#8217;s something awe-inspiring&#8212;<em>awe-full</em> in the true sense of the word&#8212;about these &#8220;terrible lizards&#8221; that lived millions of years ago, and that are utterly alien to today&#8217;s world. This sense of awe runs deep through the movie. Listening to John Williams&#8217; triumphant theme music, it doesn&#8217;t take much to realize that under the gloss of danger and horror, <em>Jurassic Park</em> is at heart a celebration of the might and majesty of the natural world.</p><p><em>Jurassic Park</em> is unabashedly a movie about dinosaurs. But it&#8217;s also a movie about greed, ambition, genetic engineering, and human folly&#8212;all rich pickings for thinking about the future, and what could possibly go wrong.</p><div><hr></div><p><em>Jurassic Park</em> opens at a scientific dig in Montana, where paleontologists Alan Grant (played by Sam Neill) and Ellie Sattler (Laura Dern) are leading a team excavating dinosaur fossils. 
Just as the team discovers the fossilized skeleton of a velociraptor, a dinosaur that Grant is particularly enamored with, the dig is interrupted by the charming, mega-rich, and, as it turns out, rather manipulative John Hammond (Richard Attenborough). As well as being founder of International Genetic Technologies Incorporated (InGen for short), Hammond has also been backstopping Grant and Sattler&#8217;s digs. On arriving, he wastes no time offering them further funding in exchange for a quick weekend mini-break to his latest and greatest masterpiece, just off the coast of Costa Rica.</p><p>We quickly learn that, beneath the charm, Hammond is fighting for the future of his company and his dream of building the ultimate tourist attraction. There&#8217;s been an unfortunate incident between a worker and one of his park&#8217;s exhibits, and his investors are getting cold feet. What he needs is a couple of respected scientists to give him their full and unqualified stamp of approval, which he&#8217;s sure they will, once they see the wonders of his &#8220;Jurassic Park.&#8221;</p><p>Grant and Sattler agree to the jaunt, in part because their curiosity has been piqued. They join Hammond, along with self-styled &#8220;chaotician&#8221; Dr. Ian Malcolm (Jeff Goldblum) and lawyer Donald Gennaro (Martin Ferrero), on what turns out to be a rather gruesome roller-coaster ride of a weekend.</p><p>From the get-go, we know that this is not going to end well. Malcolm, apart from having all the best lines in the movie, is rather enamored with his theories about chaos. These draw heavily on ideas that were gaining popularity in the 1980s, when Crichton was writing the novel the movie&#8217;s based on. Malcolm&#8217;s big idea&#8212;and the one he was riding the celebrity-scientist fame train on&#8212;is that in highly complex systems, things inevitably go wrong. 
And just as predicted, Hammond&#8217;s Jurassic Park undergoes a magnificently catastrophic failure.</p><p>The secret behind Hammond&#8217;s park is InGen&#8217;s technology for &#8220;resurrecting&#8221; long-extinct dinosaurs. Using cutting-edge gene-editing techniques, his scientists are able to reconstruct dinosaurs from recovered &#8220;dino DNA.&#8221; His source for the dino DNA is the remnants of prehistoric blood that was sucked up by mosquitoes before they were caught in tree resin and preserved in the resulting amber as the resin was fossilized.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> And his grand plan is to turn the fictitious island of Isla Nublar into the world&#8217;s first living dinosaur theme park.</p><p>Unfortunately, there were a few holes in the genetic sequences that InGen was able to extract from the preserved blood, so Hammond&#8217;s enterprising scientists filled them with bits and pieces of DNA from living species. They also engineered their dinosaurs to be all females to prevent them from breeding. And just to be on the safe side, the de-extinct dinosaurs were designed to slip into a coma and die if they weren&#8217;t fed a regular supply of the essential amino acid lysine.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a></p><p>The result is a bunch of enterprising scientists reengineering nature to create the ultimate theme park and thinking they&#8217;ve put all the safeguards they need in place to prevent something bad happening. 
Yet, despite their best efforts, the dinosaurs start breeding and multiplying, a compromised security system (and security specialist) allows them to escape, and they start eating the guests.</p><p>Even before the team of experts get to Jurassic Park, a disgruntled employee (Dennis Nedry, played by Wayne Knight) has planned to steal and sell a number of dinosaur embryos to a competitor. Nedry is the brains behind the park&#8217;s software control systems and believes he&#8217;s owed way more respect and money than he gets. At an opportune moment, he disrupts the park with what he intends to be a temporary glitch that will allow him to steal the embryos, get them off the island, and return to his station before anyone notices. Unfortunately, an incoming hurricane<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> interferes with his plans, resulting in catastrophic failure of the park&#8217;s security systems and a bunch of hungry dinosaurs roaming free. To make things worse, two of the guests are Hammond&#8217;s young grandchildren, who find their trip to the theme park transformed into a life-and-death race against a hungry <em>Tyrannosaurus rex</em> and a pack of vengeful velociraptors.</p><p>Fortunately, Sattler and Grant come into their own as paleontologists-<em>cum</em>-action-heroes. They help save a handful of remaining survivors, including Hammond, Malcolm, and the grandchildren, but not before a number of less fortunate characters have given their lives in the name of science gone badly wrong. And as they leave the island, we are left in no doubt that nature, in all its majesty, has truly trounced the ambitions of Hammond and his team of genetic engineers.</p><div><hr></div><p><em>Jurassic Park</em> is a wonderful Hollywood tale of derring-do. In fact, it stands the test of time remarkably well as an adventure movie.
It also touches on themes that are, if anything, more important today than they were back when it was made.</p><p>In 1993, when <em>Jurassic Park</em> was released, the idea of bringing extinct species back from the dead was pure science fiction. Back then, advances in understanding DNA were fueling the fantasy that, one day, we might be able to recode genetic sequences to replicate species that are no longer around, but, by any stretch of the imagination, this was beyond the wildest dreams of scientists in the early 1990s. Yet, since the movie was made, there have been incredible strides in genetic engineering, so much so that scientists are now actively working on bringing back extinct species from the dead. The field even has its own name: de-extinction.</p><p>More than the technology, though, <em>Jurassic Park</em> foreshadows the growing complexities of using powerful new technologies in an increasingly crowded and demanding world. In 1993, chaos theory was still an emerging field. Since then, it&#8217;s evolved and expanded to include whole areas of study around complex systems, especially where mixing people and technology together leads to unpredictable results.</p><p>What really stands out with <em>Jurassic Park</em>, over twenty-five years later, is how it reveals a very human side of science and technology. This comes out in questions around when we should tinker with technology and when we should leave well enough alone.
But there is also a narrative here that appears time and time again with the movies in this book, and that is how we get our heads around the sometimes oversized roles mega-entrepreneurs play in dictating how new tech is used, and possibly abused.</p><p>These are all issues that are just as relevant now as they were in 1993, and are front and center of ensuring that the technology-enabled future we&#8217;re building is one where we want to live, and not one where we&#8217;re constantly fighting for our lives.</p><div><hr></div><h2>De-Extinction</h2><p>In a far corner of Siberia, two Russians&#8212;Sergey Zimov and his son Nikita&#8212;are attempting to recreate the Ice Age. More precisely, their vision is to reconstruct the landscape and ecosystem of northern Siberia in the Pleistocene, a period in Earth&#8217;s history that stretches from around two and a half million years ago to eleven thousand years ago. This was a time when the environment was much colder than now, with huge glaciers and ice sheets flowing over much of the Earth&#8217;s northern hemisphere. It was also a time when humans coexisted with animals that are long extinct, including saber-tooth cats, giant ground sloths, and woolly mammoths.</p><p>The Zimovs&#8217; ambitions are an extreme example of &#8220;Pleistocene rewilding,&#8221; a movement to reintroduce relatively recently extinct large animals, or their close modern-day equivalents, to regions where they were once common. In the case of the Zimovs, the father-and-son team believe that, by reconstructing the Pleistocene ecosystem in the Siberian steppes and elsewhere, they can slow down the impacts of climate change on these regions. These areas are dominated by permafrost, ground that never thaws through the year. 
Permafrost ecosystems have developed and survived over millennia, but a warming global climate (a theme we&#8217;ll come back to in chapter twelve and the movie <em>The Day After Tomorrow</em>) threatens to catastrophically disrupt them, and as this happens, the impacts on biodiversity could be devastating. But what gets climate scientists even more worried is potentially massive releases of trapped methane as the permafrost disappears.</p><p>Methane is a powerful greenhouse gas&#8212;some eighty times more effective at exacerbating global warming than carbon dioxide&#8212;and large-scale releases from warming permafrost could trigger catastrophic changes in climate. As a result, finding ways to keep it in the ground is important. And here the Zimovs came up with a rather unusual idea: maintaining the stability of the environment by reintroducing long-extinct species that could help prevent its destruction, even in a warmer world. It&#8217;s a wild idea, but one that has some merit.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> As a proof of concept, though, the Zimovs needed somewhere to start. And so they set out to create a park for de-extinct Siberian animals: Pleistocene Park.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a></p><p>Pleistocene Park is by no stretch of the imagination a modern-day <em>Jurassic Park</em>. The dinosaurs in Hammond&#8217;s park date back to the Mesozoic era, from around 250 million years ago to sixty-five million years ago. By comparison, the Pleistocene is relatively modern history, ending a mere eleven and a half thousand years ago. And the vision behind Pleistocene Park is not thrills, spills, and profit, but the serious use of science and technology to stabilize an increasingly unstable environment.
Yet there is one thread that ties them together, and that&#8217;s using genetic engineering to reintroduce extinct species. In this case, the species in question is warm-blooded and furry: the woolly mammoth.</p><p>The idea of de-extinction, or bringing back species from extinction (it&#8217;s even called &#8220;resurrection biology&#8221; in some circles), has been around for a while. It&#8217;s a controversial idea, and it raises a lot of tough ethical questions. But proponents of de-extinction argue that we&#8217;re losing species and ecosystems at such a rate that we can&#8217;t afford not to explore technological interventions to help stem the flow.</p><p>Early approaches to bringing species back from the dead have involved selective breeding. The idea was simple&#8212;if you have modern descendants of a recently extinct species, selectively breeding specimens that have a higher genetic similarity to their forebears can potentially help reconstruct their genome in living animals. This approach is being used in attempts to bring back the aurochs, an ancestor of modern cattle.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> But it&#8217;s slow, and it depends on the fragmented genome of the extinct species still surviving in its modern-day equivalents.</p><p>An alternative to selective breeding is cloning. This involves finding a viable cell, or cell nucleus, in an extinct but well-preserved animal and growing a new living clone from it. It&#8217;s definitely a more appealing route for impatient resurrection biologists, but it does mean getting your hands on intact cells from long-dead animals and devising ways to &#8220;resurrect&#8221; these, which is no mean feat. Cloning has potential when it comes to recently extinct species whose cells have been well preserved&#8212;for instance, where the whole animal has become frozen in ice.
But it&#8217;s still a slow and extremely limited option.</p><p>Which is where advances in genetic engineering come in.</p><p>The technological premise of <em>Jurassic Park</em> is that scientists can reconstruct the genome of long-dead animals from preserved DNA fragments. It&#8217;s a compelling idea, if you think of DNA as a massively long and complex instruction set that tells a group of biological molecules how to build an animal. In principle, if we could reconstruct the genome of an extinct species, we would have the basic instruction set&#8212;the biological software&#8212;to reconstruct individual members of it.</p><p>The bad news is that DNA-reconstruction-based de-extinction is far more complex than this. First you need intact fragments of DNA, which is not easy, as DNA degrades easily (and is pretty much impossible to obtain, as far as we know, for dinosaurs). Then you need to be able to stitch all of your fragments together, which is akin to completing a billion-piece jigsaw puzzle without knowing what the final picture looks like. This is a Herculean task, although with breakthroughs in data manipulation and machine learning, scientists are getting better at it. But even when you have your reconstructed genome, you need the biological &#8220;wetware&#8221;&#8212;all the stuff that&#8217;s needed to create, incubate, and nurture a new living thing, like eggs, nutrients, a safe space to grow and mature, and so on. Within all this complexity, it turns out that getting your DNA sequence right is just the beginning of translating that genetic code into a living, breathing entity. But in some cases, it might be possible.</p><div><hr></div><p>In 2013, Sergey Zimov was introduced to the geneticist George Church at a conference on de-extinction. Church is an accomplished scientist in the field of DNA analysis and reconstruction, and a thought leader in the field of synthetic biology (which we&#8217;ll come back to in chapter nine).
It was a match made in resurrection biology heaven. Zimov wanted to populate his Pleistocene Park with mammoths, and Church thought he could see a way of achieving this. </p><p>What resulted was an ambitious project to de-extinct the woolly mammoth. Church and others who are working on this have faced plenty of hurdles. But the technology has been advancing so fast that, as of 2017, scientists were predicting they would be able to reproduce the woolly mammoth within the next two years.</p><p>One of those hurdles was the lack of solid DNA sequences to work from. Frustratingly, although there are many instances of well-preserved woolly mammoths, their DNA rarely survives being frozen for tens of thousands of years. To overcome this, Church and others have taken a different tack: Take a modern, living relative of the mammoth, and engineer into it traits that would allow it to live on the Siberian tundra, just like its woolly ancestors.</p><p>Church&#8217;s team&#8217;s starting point has been the Asian elephant. This is their source of base DNA for their &#8220;woolly mammoth 2.0&#8221;&#8212;their starting source code, if you like. So far, they&#8217;ve identified fifty-plus gene sequences they think they can play with to give their modern-day woolly mammoth the traits it would need to thrive in Pleistocene Park, including a coat of hair, smaller ears, and a constitution adapted to cold.</p><p>The next hurdle they face is how to translate the code embedded in their new woolly mammoth genome into a living, breathing animal. The most obvious route would be to impregnate a female Asian elephant with a fertilized egg containing the new code. But Asian elephants are endangered, and no one&#8217;s likely to allow such cutting-edge experimentation on the precious few that are still around, so scientists are working on an artificial womb for their reinvented woolly mammoth.
They&#8217;re making progress with mice and hope to crack the motherless mammoth challenge relatively soon.</p><p>It&#8217;s perhaps a stretch to call this creative approach to recreating a species (or &#8220;reanimation&#8221; as Church refers to it) &#8220;de-extinction,&#8221; as what is being formed is a new species. Just as the dinosaurs in <em>Jurassic Park</em> weren&#8217;t quite the same as their ancestors, Church&#8217;s woolly mammoths wouldn&#8217;t be the same as their forebears. But they would be designed to function within a specific ecological niche, albeit one that&#8217;s the result of human-influenced climate change. And this raises an interesting question around de-extinction: If the genetic tools we are now developing give us the ability to improve on nature, why recreate the past, when we could reimagine the future? Why stick to the DNA code that led to animals being weeded out because they couldn&#8217;t survive in a changing environment, when we could make them better, stronger, and more likely to survive and thrive in the modern world?</p><p>This idea doesn&#8217;t sit so well with some people, who argue that we should be dialing down human interference in the environment and turning the clock back on human destruction. And they have a point, especially when we consider the genetic diversity we are hemorrhaging away with the current rate of biodiversity loss. Yet we cannot ignore the possibilities that modern genetic engineering is opening up. These include the ability to rapidly and cheaply read genetic sequences and translate them to digital code, to virtually manipulate them and recode them, and then to download them back into the real world. These are heady capabilities, and for some there is an almost irresistible pull toward using them, so much so that some would argue that not to use them would be verging on the irresponsible.</p><p>These tools take us far beyond de-extinction. 
The reimagining of species like the woolly mammoth is just the tip of the iceberg when it comes to genetic design and engineering. Why stop at recreating old species when you could redesign current ones? Why just redesign existing species when you could create brand-new ones? And why stick to the genetic language of all earth-bound living creatures, when you could invent a new language&#8212;a new DNA? In fact, why not go all the way, and create alien life here on earth?</p><p>These are all conversations that scientists are having now, spurred on by breakthroughs in DNA sequencing, analysis, and synthesis. Scientists are already developing artificial forms of DNA that contain more than the four DNA building blocks found in nature.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> And some are working on creating completely novel artificial cells that not only are constructed from off-the-shelf chemicals, but also have a genetic heritage that traces back to computer programs, not evolutionary life. In 2016, for instance, scientist and entrepreneur Craig Venter announced that his team had produced a completely artificial living cell.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a> Venter&#8217;s cell&#8212;tagged &#8220;JCVI-syn3.0&#8221;&#8212;is paving the way for designing and creating completely artificial life forms, and the work being done here by many different groups is signaling a possible transition from biological evolution to biology by design.</p><p>One of the interesting twists to come out of this research is that scientists are developing the ability to &#8220;watermark&#8221; their creations by embedding genetic identity codes. 
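</p><p>The basic idea behind a genetic watermark is easy to illustrate: any short message can be mapped onto DNA&#8217;s four-letter alphabet and recovered again later. Here is a minimal sketch in Python; the two-bit encoding below is an invented example for illustration, not the actual scheme Venter&#8217;s team used.</p>

```python
# Toy DNA "watermark": map text to bases with an invented 2-bit code.
# (Real watermarks, such as those in JCVI-syn3.0, use more elaborate
# encodings, but the principle is the same.)
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(text):
    """Turn ASCII text into a string of A/C/G/T bases."""
    bits = "".join(f"{byte:08b}" for byte in text.encode("ascii"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(seq):
    """Recover the original text from a watermark sequence."""
    bits = "".join(BASE_TO_BITS[base] for base in seq)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("ascii")

mark = encode("InGen")
print(mark)          # a 20-base sequence carrying the hidden name
print(decode(mark))  # round-trips back to "InGen"
```

<p>In a scheme like this, each character costs four bases, so even a designer&#8217;s full name and a date would add only a few dozen bases to a genome that runs to millions.</p><p>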
As research here progresses, future generations may be able to pinpoint precisely who designed the plants and animals around them, and even parts of their own bodies, including when and where they were designed. This does, of course, raise some rather knotty ethical questions around ownership. If you one day have a JCVI-tagged dog, or a JCVI-watermarked replacement kidney, for instance, who owns them?</p><p>This research is pushing us into ethical questions that we&#8217;ve never had to face before. But it&#8217;s being justified by the tremendous benefits it could bring for current and future generations. These touch on everything from bio-based chemicals production to new medical treatments and ways to stay healthier longer, and even designer organs and body-part replacements at some point. It&#8217;s also being driven by our near-insatiable curiosity and our drive to better understand the world we live in and gain mastery over it. And here, just like the scientists in <em>Jurassic Park</em>, we&#8217;re deeply caught up in what we can do as we learn to code and recode life.</p><p>But, just because we can now resurrect and redesign species, should we?</p><h2>Could We, Should We?</h2><p>Perhaps one of the most famous lines from <em>Jurassic Park</em>&#8212;at least for people obsessed with the dark side of science&#8212;is when Ian Malcolm berates Hammond, saying, &#8220;Your scientists were so preoccupied with whether they could, they didn&#8217;t stop to think if they should.&#8221;</p><p>Ethics and responsibility in science are complicated. I&#8217;ve met remarkably few scientists and engineers who would consider themselves to be unethical or irresponsible.
That said, I know plenty of scientists who are so engaged with their work and the amazing things they believe it&#8217;ll lead to that they sometimes struggle to appreciate the broader context within which they operate.</p><p>The challenges surrounding ethical and responsible research are deeply pertinent to de-extinction. A couple of decades ago, they were largely academic. The imaginations of scientists, back when <em>Jurassic Park</em> hit the screen, far outstripped the techniques they had access to at the time. Things are very different now, though, as research on woolly mammoths and other extinct species is showing. In a very real way, we&#8217;re entering a world that very much echoes the &#8220;can-do&#8221; culture of Hammond&#8217;s <em>Jurassic Park</em>, where scientists are increasingly able to do what was once unimaginable. In such a world, where do the lines between &#8220;could&#8221; and &#8220;should&#8221; lie, and how do scientists, engineers, and others develop the understanding and ability to do what is socially responsible, while avoiding what is not?</p><p>Of course, this is not a new question. The tensions between technological advances and social impacts were glaringly apparent through the Industrial Revolution, as mechanization led to job losses and hardship for some. And the invention of the atomic bomb, followed by its use on Hiroshima and Nagasaki in the Second World War, took us into deeply uncharted territory when it came to balancing what we can and should do with powerful technologies.
Yet, in some ways, the challenges we&#8217;ve faced in the past over the responsible development and use of science and technology were just a rehearsal for what&#8217;s coming down the pike, as we enter a new age of technological innovation.</p><div><hr></div><p>For all its scientific inaccuracies and fantastical scenarios, <em>Jurassic Park</em> does a good job of illuminating the challenges of unintended consequences arising from somewhat na&#239;ve and myopic science. Take InGen&#8217;s scientists, for instance. They&#8217;re portrayed as being so enamored with what they&#8217;ve achieved that they lack the ability to see beyond their own brilliance to what they might have missed.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a> Of course, they&#8217;re not fools. They know that they&#8217;re breaking new ground by bringing dinosaurs back to life, and that there are going to be risks. It would be problematic, for instance, if any of the dinosaurs escaped the island and survived, and they recognize this. So the scientists design them to be dependent on a substance it was thought they couldn&#8217;t get enough of naturally, the essential amino acid lysine. This was the so-called &#8220;lysine contingency,&#8221; and, as it turns out, it isn&#8217;t too dissimilar from techniques real-world genetic engineers use to control their progeny.</p><p>Even though it&#8217;s essential to life, lysine isn&#8217;t synthesized naturally by animals. As a result, it has to be ingested, either in its raw form or by eating foods that contain it, including plants or bacteria (and their products) that produce it naturally, for instance, or other animals. In their wisdom, InGen&#8217;s scientists assume that they can engineer lysine dependency into their dinosaurs, then keep them alive with a diet rich in the substance, thinking that they wouldn&#8217;t be able to get enough lysine if they escaped. 
The trouble is, this contingency turns out to be about as useful as trying to starve someone by locking them in a grocery store.</p><p>There&#8217;s a pretty high chance that the movie&#8217;s scriptwriters didn&#8217;t know that this safety feature wouldn&#8217;t work, or that they didn&#8217;t care. Either way, it&#8217;s a salutary tale of scientists who are trying to be responsible&#8212;at least their version of &#8220;responsible&#8221;&#8212;but are tripped up by what they don&#8217;t know, and what they don&#8217;t care to find out.</p><p>In the movie, not much is made of the lysine contingency, unlike in Michael Crichton&#8217;s book that the movie&#8217;s based on, where this basic oversight leads to the eventual escape of the dinosaurs from the island and onto the mainland. There is another oversight, though, that features strongly in the movie, and is a second strike against the short-sightedness of the scientists involved. This is the assumption that InGen&#8217;s dinosaurs couldn&#8217;t breed.</p><p>This is another part of the storyline where scientific plausibility isn&#8217;t allowed to stand in the way of a good story. But, as with the lysine, it flags the dangers of thinking you&#8217;re smart enough to have every eventuality covered. In the movie, InGen&#8217;s scientists design all of their dinosaurs to be females. Their thinking: no males, no breeding, no babies, no problem. Apart from one small issue: When stitching together their fragments of dinosaur DNA with that of living species, they filled some of the holes with frog DNA.</p><p>This is where we need to suspend scientific skepticism somewhat, as designing a functional genome isn&#8217;t as straightforward as cutting and pasting from one animal to another. 
In fact, this is so far from how things work that it would be like an architect, on losing a few pages from the plans of a multi-million-dollar skyscraper, slipping in a few random pages from a cookie-cutter duplex and hoping for the best. The result would be a disaster. But stick with the story for the moment, because in the world of <em>Jurassic Park</em>, this na&#239;ve mistake led to a tipping point that the scientists didn&#8217;t anticipate. Just as some species of frog can switch from female to male with the right environmental stimuli, the DNA borrowed from frogs inadvertently gave the dinosaurs the same ability. And this brings us back to the real world, or at least the near-real world, of de-extinction. As scientists and others begin to recreate extinct species, or redesign animals based on long-gone relatives, how do we ensure that, in their cleverness, they&#8217;re not missing something important?</p><p>Some of this comes down to what responsible science means, which, as we&#8217;ll discover in later chapters, is about more than just having good intentions. It also means having the humility to recognize your limitations, and the willingness to listen to and work with others who bring different types of expertise and knowledge to the table. This possibility of unanticipated outcomes shines a bright spotlight on the question of whether some lines of research or technological development should be pursued, even if they could be. <em>Jurassic Park</em> explores this through genetic engineering and de-extinction, but the same questions apply to many other areas of technological advancement, where new knowledge has the potential to have a substantial impact on society.
And the more complex the science and technology we begin to play with is, the more pressing this distinction between &#8220;could&#8221; and &#8220;should&#8221; becomes.</p><p>Unfortunately, there are no easy guidelines or rules of thumb that help decide what is probably okay and what is probably not, although much of this book is devoted to ways of thinking that reduce the chances of making a mess of things. Even when we do have a sense of how to decide between great ideas and really bad ones, though, there&#8217;s one aspect of reality we can&#8217;t escape from: Complex systems behave in unpredictable ways.</p><h2>The Butterfly Effect</h2><p>Michael Crichton started playing with the ideas behind <em>Jurassic Park</em> in the 1980s, when &#8220;chaos&#8221; was becoming trendy. I was an undergraduate at the time, studying physics, and it was nearly impossible to avoid the world of &#8220;strange attractors&#8221; and &#8220;fractals.&#8221; These were the years of the &#8220;Mandelbrot Set&#8221; and computers that were powerful enough to calculate the numbers it contained and display them as stunningly psychedelic images. The recursive complexity in the resulting fractals became the poster child for a growing field of mathematics that grappled with systems where, beyond certain limits, their behavior was impossible to predict. The field came to be known informally as chaos theory. </p><p>Chaos theory grew out of the work of the American meteorologist Edward Lorenz. When he started his career, it was assumed that the solution to more accurate weather prediction was better data and better models. But in the 1950s, Lorenz began to challenge this idea. 
What he found was that, in some cases, minute changes in atmospheric conditions could lead to dramatically different outcomes down the line, so much so that, in sufficiently complex systems, it was impossible to predict the results of seemingly insignificant changes.</p><p>In 1963, when he published the paper that established chaos theory,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a> it was a revolutionary idea&#8212;at least to scientists who still hung onto the assumption that we live in a predictable world. Much as quantum physics challenged scientists&#8217; ideas of how predictable physical processes are in the invisible world of atoms and subatomic particles, chaos theory challenged their belief that, if we have enough information, we can predict the outcomes of our actions in our everyday lives.</p><p>At the core of Lorenz&#8217;s ideas was the observation that, in a sufficiently complex system, the smallest variation could lead to profound differences in outcomes. In 1972, he coined the term &#8220;the Butterfly Effect,&#8221; suggesting that the world&#8217;s weather systems are so complex and interconnected that a butterfly flapping its wings on one side of the world could initiate a chain of events that ultimately led to a tornado on the other.</p><p>Lorenz wasn&#8217;t the first to suggest that small changes in complex systems can have large and unpredictable effects. But he was perhaps the first to pull the idea into mainstream science. And this is where chaos theory might have stayed, were it not for the discovery of the &#8220;Mandelbrot Set&#8221; by mathematician Benoit Mandelbrot.</p><p>In 1979, Mandelbrot demonstrated how a seemingly simple equation could lead to images of infinite complexity. The more you zoomed in to the images his equation produced, the more detail became visible.
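</p><p>The calculation behind those images is startlingly small. For each point c in the complex plane, you repeatedly square a number, add c, and ask how long the result stays bounded. A rough Python sketch of this &#8220;escape time&#8221; idea (the grid size and rendering choices here are my own):</p>

```python
# Escape-time test at the heart of Mandelbrot images: a point c belongs
# to the set if iterating z -> z*z + c never diverges.
def escape_time(c, max_iter=100):
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:   # once |z| exceeds 2, divergence is guaranteed
            return n     # iterations survived before escaping
    return max_iter      # never escaped: treat as inside the set

# Coarse ASCII rendering: '#' marks points inside the set.
for im in range(-10, 11):
    print("".join(
        "#" if escape_time(complex(re / 10, im / 10)) == 100 else "."
        for re in range(-20, 11)))
```

<p>Zooming in is just a matter of shrinking the grid spacing, and however far you zoom, new filigree keeps appearing along the boundary.</p><p>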
As with Lorenz&#8217;s work, Mandelbrot&#8217;s research showed that very simple beginnings could lead to complex, unpredictable, and chaotic outcomes. But Lorenz, Mandelbrot, and others also revealed another intriguing aspect of chaos theory, and this was that complex systems can lead to predictable chaos. This may seem counterintuitive, but what their work showed was that, even where chaotic unpredictability reigns, there are always limits to what the outcomes might be.</p><p>Mandelbrot fractals became all the rage in the 1980s. As a new generation of computer geeks got their hands on the latest personal computers, kids began to replicate the Mandelbrot fractal and revel in its complexity. Reproducing it became a test of one&#8217;s coding expertise and the power of one&#8217;s hardware. In one memorable guest lecture on parallel processing I attended, the lecturer even demonstrated the power of a new chip by showing how fast it could produce Mandelbrot fractals.</p><p>This growing excitement around chaos theory and the idea that the world is ultimately unpredictable was admirably captured in James Gleick&#8217;s 1987 book <em>Chaos: Making a New Science</em>.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a> Gleick pulled chaos theory out of the realm of scientists and computer geeks and placed it firmly in the public domain, and also into the hands of novelists and moviemakers. In <em>Jurassic Park</em>, Ian Malcolm captures the essence of the chaos zeitgeist, and uses this to drive along a narrative of na&#239;ve human arrogance versus the triumphal dominance of chaotic, unpredictable nature. Naturally, there&#8217;s a lot of hokum here, including the rather silly idea that chaos theory means being able to predict when chaos will occur (it doesn&#8217;t).
But the concept that we cannot wield perfect control over complex technologies within a complex world is nevertheless an important one.</p><p>Chaos theory suggests that, in a complex system, immeasurably small actions or events can profoundly affect what happens over the course of time, making accurate predictions of the future well-nigh impossible. This is important as we develop and deploy highly complex technologies. However, it also suggests that there are boundaries to what might happen and what will not as we do this. And these boundaries become highly relevant in separating out plausible futures from sheer fantasy.</p><p>Chaos theory also indicates that, within complex systems, there are points of stability. In the context of technological innovation, this suggests that there are some futures that are more likely to occur if we take the appropriate courses of action. But these are also futures that can be squandered if we don&#8217;t think ahead about our actions and their consequences.</p><p><em>Jurassic Park</em> focuses on the latter of these possibilities, and it does so to great effect. What we see unfolding is a catastrophic confluence of poorly understood technology, the ability of natural systems to adapt and evolve, unpredictable weather, and human foibles. The result is a park in chaos and dinosaurs dining on people. This is a godsend for a blockbuster movie designed to scare and thrill its audiences. But how realistic is this chaotic confluence of unpredictability?</p><p>As it turns out, it&#8217;s pretty realistic&#8212;up to a point. Chaos theory isn&#8217;t as trendy today as it was back when <em>Jurassic Park</em> was made. But the realization that complex systems are vulnerable to big (and sometimes catastrophic) shifts in behavior stemming from small changes is a critical area of research.
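</p><p>That sensitivity is easy to see in one of the simplest chaotic systems studied in those years: the logistic map, which repeatedly replaces a value x with rx(1 - x). The short Python sketch below (with illustrative parameter values of my own choosing) starts two runs a ten-millionth apart and watches them part company:</p>

```python
# Sensitive dependence on initial conditions in the logistic map,
# x -> r * x * (1 - x), which behaves chaotically at r = 4.
def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2)          # one starting value...
b = trajectory(0.2000001)    # ...and one a ten-millionth away
for step in (0, 10, 20, 30, 40):
    print(step, round(a[step], 4), round(b[step], 4))
# The two runs track each other early on, but within a few dozen
# steps they bear no resemblance to each other.
```

<p>No measurement of the starting value, however precise, buys more than a short horizon of prediction, which is exactly the predicament Lorenz identified for weather forecasters.</p><p>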
And we know that technological innovation has the capacity to trigger events and outcomes within the complex social and environmental systems we live in that are hard to predict and manage.</p><p>As if to press the point home, as I&#8217;m writing this, Hurricane Harvey has just swept through Houston, causing unprecedented devastation. The broad strokes of what occurred were predictable to an extent&#8212;the massive flooding exacerbated by poor urban planning, the likelihood of people and animals being stranded and killed, even the political rhetoric around who was responsible and what could have been done better. In the midst of all of this, though, a chemical plant owned by the French company Arkema underwent a catastrophic failure.</p><p>The plant produced organic peroxides. These are unstable, volatile chemicals that need to be kept cool to keep them safe, but they are also important in the production of many products we use on a daily basis. As Harvey led to widespread flooding, the electrical supplies powering the plant&#8217;s cooling systems failed one by one&#8212;first the main supply, then the backups. In the end, all the company could do was move the chemicals to remote parts of the plant and wait for them to vent, ignite, and explode.</p><p>On its own, this would seem like an unfortunate but predictable outcome. But there&#8217;s evidence of a cascade of events that exacerbated the failure, many of them seemingly insignificant, but all part of a web of interactions that resulted in the unintended ignition of stored chemicals and the release of toxic materials into the environment. The news and commentary site BuzzFeed obtained a logbook from the plant that paints a picture of cascading incidents, including &#8220;overflowing wastewater tanks, failing power systems, toilets that stopped working, and even a snake, washed in by rising waters. Then finally: &#8216;extraction&#8217; of the crew by boat. 
And days later, blasts and foul, frightening smoke.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a></p><p>Contingencies were no doubt in place for flooding and power failures. Overflowing toilets and snakes? Probably not. Yet so often it&#8217;s these seemingly small events that help trigger larger and seemingly chaotic ones in complex systems.</p><p>Such cascades of events leading to unexpected outcomes are more common than we sometimes realize. For instance, few people expect industrial accidents to occur, but they nevertheless do. In fact, they happen so regularly that the academic Charles Perrow coined the term &#8220;normal accidents,&#8221; together with the theory that, in any sufficiently complex technological system, unanticipated events are inevitable.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a></p><p>Of course, if Hammond had read his Perrow, he might have had a better understanding of just how precarious his new Jurassic Park was. Sadly, he didn&#8217;t. But even if Hammond and his team had been aware of the challenges of managing complex systems, there&#8217;s another factor behind the movie&#8217;s chaos that reflects real life: the outsized role that power plays in determining the trajectory of a new technology, along with any fallout that accompanies it.</p><h2>Visions of Power</h2><p>Beyond the genetic engineering, the de-extinction, and the homage to chaos theory, <em>Jurassic Park</em> is a movie about power: not only the power to create and destroy life, but the power to control others, to dominate them, and to win.</p><p>Power, and the advantages and rewards it brings, is deeply rooted in human nature, together with the systems we build that reflect and amplify this nature. 
But this nature in turn reflects the evolutionary processes that we are a product of. <em>Jurassic Park</em> cleverly taps into this with its dinosaur-power theme. In fact, one of the movie&#8217;s more compelling narrative threads is the power and dominance of the dinosaurs and the natural world over their human creators, who merely have delusions of power. Yet this is also a movie about human power dynamics, and how these influence the development, use, and, in this case, the ultimate abuse of new technologies.</p><p>There are some interesting side stories about power here: for instance, the power Ian Malcolm draws from his &#8220;excess of personality.&#8221; But it&#8217;s the power dynamic between Hammond, the lawyer Donald Gennaro, and InGen&#8217;s investors that particularly intrigues me. Here, we get a glimpse of how visions of power can deeply influence actions.</p><p>At a very simple level, <em>Jurassic Park</em> is a movie about corporate greed. Hammond&#8217;s investors want a return on their investment, and they are threatening to exert their considerable power to get it. Gennaro is their proxy, but this in turn places him in a position of power. He&#8217;s the linchpin who can make or break the park, and he knows it.</p><p>Then there&#8217;s Hammond himself, who revels in his power over people as an entertainer, charmer, and entrepreneur.</p><p>These competing visions of power create a dynamic tension that ultimately leads to disaster, as the pursuit of personal and corporate gain leads to sacrificed lives and morals. In this sense, <em>Jurassic Park</em> is something of a morality tale, a warning against placing power and profit over what is right and good. Yet this is too simplistic a takeaway from the perspective of developing new technologies responsibly.</p><p>In reality, there will always be power differentials and power struggles. 
Not only are many of these legitimate&#8212;including the fiduciary responsibility of innovators to investors&#8212;but they are also an essential driving force that prevents society from stagnating. The challenge we face is not to abdicate power, but to develop socially responsible ways of understanding and using it.</p><p>This does not happen in <em>Jurassic Park</em>, clearly. But that doesn&#8217;t mean that we cannot have responsible innovation that works, meaningful corporate social responsibility, or even ethical entrepreneurs. It&#8217;s easy to see the downsides of powerful organizations and individuals pushing through technological innovation at the expense of others. And there are many downsides; you just need to look at the past two hundred years of environmental harm and human disease tied to technological innovation to appreciate this. Yet innovation driven by profit and the desire to amass and wield power has also created a lot of good. The challenge is how we harness the realities of who we are and the world we live in to build a better future for as many people as we can, without sacrificing the health and well-being of communities and individuals along the way.</p><p>In large part, this is about learning how to develop and wield power appropriately&#8212;not eschewing it, but understanding and accepting the sometimes-complex responsibilities that come with it. And this isn&#8217;t limited to commercial or fiscal power. Scientists wield power with the knowledge they generate. Activists wield power in the methods they use and the rhetoric they employ. Legislators have the power to establish law. And citizens collectively have considerable power over who does what and how. 
Understanding these different facets of power and its responsible use is critical to the safe and beneficial development and use of new technologies&#8212;not just genetic engineering, but every other technology that touches our lives as well, including the technology that&#8217;s at the center of our next movie: <em>Never Let Me Go</em>. [For which you&#8217;ll need to get hold of a copy of the book!]</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://chatgpt.com/?q=Read%20the%20following%20article%20in%20its%20entirety.%20Create%20a%20summary%20of%20the%20main%20takeaways%2C%20and%20suggest%20three%20additional%20articles%20worth%20reading.%20Include%20a%20link%20to%20each%20article.%20Write%20at%20a%20level%20of%20a%20curious%20reader%20looking%20for%20interesting%20insights.%20Provide%20a%20link%20back%20to%20the%20original%20article%20at%20the%20end%20and%20ask%20whether%20the%20user%20has%20any%20follow%20up%20questions.%20https%3A%2F%2Ffutureofbeinghuman.com%2Fp%2Fde-extinction-conservation-futures&quot;,&quot;text&quot;:&quot;ChatGPT summary and further reading&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://chatgpt.com/?q=Read%20the%20following%20article%20in%20its%20entirety.%20Create%20a%20summary%20of%20the%20main%20takeaways%2C%20and%20suggest%20three%20additional%20articles%20worth%20reading.%20Include%20a%20link%20to%20each%20article.%20Write%20at%20a%20level%20of%20a%20curious%20reader%20looking%20for%20interesting%20insights.%20Provide%20a%20link%20back%20to%20the%20original%20article%20at%20the%20end%20and%20ask%20whether%20the%20user%20has%20any%20follow%20up%20questions.%20https%3A%2F%2Ffutureofbeinghuman.com%2Fp%2Fde-extinction-conservation-futures"><span>ChatGPT summary and further reading</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" 
contenteditable="false" target="_self">1</a><div class="footnote-content"><p>It&#8217;s worth taking a few minutes to browse through the <a href="https://colossal.com/">Colossal website</a>. I thought that AI companies like OpenAI, Anthropic and others were full of themselves, but Colossal takes this to a whole new level. You have to admire their audacity though!</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>For instance see <a href="https://futureofbeinghuman.com/p/evo-2-dna-ai">An AI model that can decode and design living organisms</a>.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>A 2013 study tried to extract DNA from copal, an ancient form of resin that precedes full fossilization into amber. The scientists failed, and as a result claimed that it&#8217;s exceedingly unlikely that DNA could be extracted from amber, which is millions of years older than copal. Jurassic Park has a great scientific premise. Sadly, it&#8217;s not a realistic one. Penney D, et al. (2013). &#8220;Absence of Ancient DNA in Sub-Fossil Insect Inclusions Preserved in &#8216;Anthropocene&#8217; Colombian Copal.&#8221; PLoS One 8(9). <a href="http://doi.org/10.1371/journal.pone.0073150">http://doi.org/10.1371/journal.pone.0073150</a></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>There is just a passing mention of the Jurassic Park dinosaurs&#8217; dependence on lysine in the movie. 
In the original book, though, lysine dependence plays a substantial role in the ensuing story.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>During filming, an actual hurricane hit the site. Some of the storm footage is real.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>You can read more about the quest to increase environmental resilience by resurrecting the woolly mammoth in Ben Mezrich&#8217;s book &#8220;Woolly: The True Story of the Quest to Revive One of History&#8217;s Most Iconic Extinct Creatures&#8221; (2017, Atria Books).</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>This is a real project, with a real website. You can discover more at <a href="http://www.pleistocenepark.ru/en/">http://www.pleistocenepark.ru/en/</a></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>The Tauros Program is a Dutch initiative to create what they call a &#8220;true replacement&#8221; for the extinct aurochs. 
You can find out more at <a href="https://www.taurosproject.com/">https://www.taurosproject.com/</a></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>In 2009, a team of scientists synthesized an artificial form of DNA with six nucleotide building blocks, rather than the four found in naturally-occurring DNA (Georgiadis, M. M., et al. (2015). &#8220;Structural Basis for a Six-Nucleotide Genetic Alphabet.&#8221; Journal of the American Chemical Society 137(21): 6947-6955. <a href="http://doi.org/10.1021/jacs.5b03482">http://doi.org/10.1021/jacs.5b03482</a>). More recently, scientists reported in the journal Nature that they had created a semi-synthetic organism that used artificial six-letter DNA to store and retrieve information (Zhang, Y., et al. (2017). &#8220;A semi-synthetic organism that stores and retrieves increased genetic information.&#8221; Nature 551: 644. <a href="http://doi.org/10.1038/nature24659">http://doi.org/10.1038/nature24659</a>). [Note: remember this was written in 2018 &#8212; things have progressed since then]</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>Venter&#8217;s team&#8217;s work is described in the journal Nature in 2016. Callaway, E. (2016). &#8220;&#8216;Minimal&#8217; cell raises stakes in race to harness synthetic life.&#8221; Nature 531: 557&#8211;558. 
<a href="http://doi.org/10.1038/531557a">http://doi.org/10.1038/531557a</a></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>Despite my portrayal of InGen&#8217;s scientists as enthusiastically short-sighted, the company&#8217;s Chief Scientist, Henry Wu (played by BD Wong), is increasingly revealed to have serious evil-scientist tendencies in subsequent movies in the series.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>The paper was titled &#8220;Deterministic Nonperiodic Flow&#8221; and was published in the Journal of the Atmospheric Sciences. Edward N. Lorenz (1963). &#8220;Deterministic Nonperiodic Flow&#8221;. Journal of the Atmospheric Sciences. 20 (2): 130&#8211;141. <a href="http://doi.org/10.1175/1520-0469(1963)020%3C0130:DNF%3E2.0.CO;2">http://doi.org/10.1175/1520-0469(1963)020&lt;0130:DNF&gt;2.0.CO;2</a></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>James Gleick (1987) &#8220;Chaos: Making a New Science.&#8221; Viking, New York.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>Nidhi Subbaraman and Jessica Garrison (2017) &#8220;Here&#8217;s What Happened In The Hours After Hurricane Harvey Hit A Chemical Plant, According To A Staff Log&#8221; BuzzFeed, November 16, 2017. 
<a href="https://www.buzzfeed.com/nidhisubbaraman/arkema-chemical-plant-houston-timeline">https://www.buzzfeed.com/nidhisubbaraman/arkema-chemical-plant-houston-timeline</a></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>Charles Perrow developed his ideas in his 1984 book &#8220;Normal Accidents: Living with High-Risk Technologies,&#8221; published by Princeton University Press.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[OpenAI and Studio Ghibli style: Theft or homage?]]></title><description><![CDATA[This week's episode of Modem Futura explores how OpenAI's new image generator is stirring up old questions about art, authorship, innovation, and a whole lot more.]]></description><link>https://www.futureofbeinghuman.com/p/openai-and-studio-ghibli-style</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/openai-and-studio-ghibli-style</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 08 Apr 2025 11:39:53 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!P_ZD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!P_ZD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!P_ZD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!P_ZD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!P_ZD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!P_ZD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!P_ZD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3700387,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/160831561?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!P_ZD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!P_ZD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!P_ZD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!P_ZD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F292eccfa-4d06-4aae-be94-0de565df3f55_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Generated, appropriately, using OpenAI ChatGPT 4o &#8212; <em>not</em> using Studio Ghibli style!</figcaption></figure></div><p>In the latest episode of <em><a href="https://futureofbeinghuman.asu.edu/2024/10/09/modem-futura-podcast/">Modem Futura</a></em>, Sean Leahy and I dig into the ethical and legal landscape around AI-generated images, spurred on by the recent Studio Ghibli style controversy ignited by OpenAI&#8217;s newest image generation model.</p><p>As we note in the podcast, people have been grappling with the challenges as well as the opportunities of AI-generated images for some time (including whether &#8220;art&#8221; is an appropriate description here). Yet as AI image generators get more sophisticated &#8212; and especially as they become capable, as with OpenAI&#8217;s latest offering, of mimicking styles synonymous with human artisanship representing hundreds of hours of skilled work &#8212; the issues raised are becoming more pointed and more complex.</p><p>Underneath the usual banter, this was a nuanced conversation around an issue that is far from black and white.</p><p>And of course, I couldn&#8217;t resist abandoning my usual use of Midjourney version 3 to use ChatGPT 4o to generate the image for this post! 
The image above is based on a transcript of the podcast and a photo of Sean and me in the recording studio, rendered in a style of ChatGPT&#8217;s choosing.</p><p>I have a sneaking suspicion that ChatGPT didn&#8217;t appreciate the irony of the caption, given the image rendering &#128522;</p><p>Listen below on <a href="https://podcasts.apple.com/us/podcast/vibecasting-studio-ghibli-and-ai-redefining/id1771688480?i=1000702664476">Apple Podcasts</a> or on <a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem">Spotify</a> or <a href="https://www.youtube.com/watch?v=J0QDLNwApuc">YouTube</a>. And if you want to jump to specific entry points, check out ChatGPT o1-Pro&#8217;s generated summary with approximate time stamps.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/vibecasting-studio-ghibli-and-ai-redefining/id1771688480?i=1000702664476&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000702664476.jpg&quot;,&quot;title&quot;:&quot;Vibecasting: Studio Ghibli and AI Redefining Creativity and Intellectual Property&quot;,&quot;podcastTitle&quot;:&quot;Modem Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:4369000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/vibecasting-studio-ghibli-and-ai-redefining/id1771688480?i=1000702664476&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-04-08T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/vibecasting-studio-ghibli-and-ai-redefining/id1771688480?i=1000702664476" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><h3>Entry points 
from ChatGPT o1-Pro:</h3><h4>AI image generation and its rapid evolution</h4><p><strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=424">Approximate timestamp:</a></strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=424"> ~07:04</a></p><p>Andrew and Sean discuss significant advancements in generative AI, particularly focusing on image generation platforms/models like DALL-E and Midjourney. They highlight how these AI models have evolved rapidly from creating unrealistic images to nearly photorealistic ones.</p><div><hr></div><h4>Intellectual property and ethical dilemmas in AI art</h4><p><strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=1124">Approximate timestamp:</a></strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=1124"> ~18:44</a></p><p>The conversation explores ethical and intellectual property issues raised by generative AI. Sean and Andrew discuss how AI-generated art challenges existing legal frameworks and ethics around ownership, copyright, and the use of human-created works as training data.</p><div><hr></div><h4>The Studio Ghibli controversy</h4><p><strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=2220">Approximate timestamp:</a></strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=2220"> ~37:00</a></p><p>The podcast addresses the controversy surrounding OpenAI's new image generator explicitly capable of replicating Studio Ghibli's distinctive animation style. The discussion explores the moral implications of recreating iconic styles without permission, potential legal issues, and the broader implications for art and creativity.</p><div><hr></div><h4>Vibecasting! (and human vs. 
machine creativity and artistic value)</h4><p><strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=3267">Approximate timestamp:</a></strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=3267"> ~54:27</a></p><p>Sean and Andrew explore the differences between human and AI-generated creativity. They focus on the intrinsic value of human artistic processes and discuss whether machine-generated art diminishes or complements human creativity and effort.</p><div><hr></div><h4>Fan art and public ownership of creative works</h4><p><strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=3776">Approximate timestamp:</a></strong><a href="https://open.spotify.com/episode/1UaAs0stZmyEOYFrZHcbem?t=3776"> ~01:02:56</a></p><p>The podcast explores fan art, AI-generated images, and the concept of public ownership once a creative work is widely embraced. The hosts question how creators' intentions align or conflict with how the public engages with their works, particularly in the age of AI.</p><div><hr></div><p>As always you can watch the episode on YouTube as well as listening to us (and this image was generated using Studio Ghibli style):</p><div id="youtube2-dF7v3hWH6rk" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;dF7v3hWH6rk&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/dF7v3hWH6rk?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you find the podcast useful or think others will enjoy listening, please do share it with colleagues and leave us a rating or review.<a 
href="https://futureofbeinghuman.com/p/hype-hope-and-the-human-element-in-ai#footnote-1-159264038"><sup>1</sup></a></p><p>Thanks!</p><p>Andrew</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>These are the themes that OpenAI o1-Pro thought were important. Lightly edited. </p></div></div>]]></content:encoded></item><item><title><![CDATA[When AI Takes the Wheel: Rethinking Education for a Post-Scarcity World]]></title><description><![CDATA[This week's episode of Modem Futura explores the intersection between AI, learning, and education &#8211; and gets really speculative as we think about education in a post-scarcity future]]></description><link>https://www.futureofbeinghuman.com/p/when-ai-takes-the-wheel</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/when-ai-takes-the-wheel</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 01 Apr 2025 13:35:44 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!rlFD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rlFD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rlFD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 
424w, https://substackcdn.com/image/fetch/$s_!rlFD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 848w, https://substackcdn.com/image/fetch/$s_!rlFD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!rlFD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rlFD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4411050,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/160299655?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!rlFD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 424w, https://substackcdn.com/image/fetch/$s_!rlFD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 848w, https://substackcdn.com/image/fetch/$s_!rlFD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!rlFD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a3a635-6104-4430-a85a-d2f5b04b35fb_2688x1536.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>In the latest episode of <em><a href="https://futureofbeinghuman.asu.edu/2024/10/09/modem-futura-podcast/">Modem Futura</a></em>, Sean Leahy and I dig into the intersection between AI and education, drawing on the keynote that I&#8217;d just given to the <a href="https://yidanprize.org/events-and-news/events/meeting-the-future-of-teaching-and-learning">Yidan Prize Conference</a> (and which I wrote about <a href="https://futureofbeinghuman.com/p/reimagining-education-in-an-age-of-ai">in my previous post</a>). If you&#8217;ve read that post you&#8217;ll know that I set out to be provocative as I talked about how we rethink our concepts of learning and education in a &#8220;compressed&#8221; future where AI not only accelerates what we can do, but changes who we are. </p><p>Not content to just talk about the keynote though, we also get speculative and explore what learning and education might mean in a &#8220;post-scarcity&#8221; future.</p><p>Listen below on <a href="https://podcasts.apple.com/us/podcast/when-ai-takes-the-wheel-rethinking-education-for-a/id1771688480?i=1000701698099">Apple Podcasts</a> or on <a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs">Spotify</a> or <a href="https://www.youtube.com/watch?v=YeRV93YohV0">YouTube</a>. 
And if you want to jump to specific entry points, check out ChatGPT o1-Pro&#8217;s generated summary with approximate time stamps.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/when-ai-takes-the-wheel-rethinking-education-for-a/id1771688480?i=1000701698099&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000701698099.jpg&quot;,&quot;title&quot;:&quot;When AI Takes the Wheel: Rethinking Education for a Post-Scarcity World&quot;,&quot;podcastTitle&quot;:&quot;Modem Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:3743000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/when-ai-takes-the-wheel-rethinking-education-for-a/id1771688480?i=1000701698099&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-04-01T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/when-ai-takes-the-wheel-rethinking-education-for-a/id1771688480?i=1000701698099" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><h3>Entry points from the brain of ChatGPT o1-Pro:</h3><h4><strong>Post-Keynote Brain Mush</strong></h4><p><em>Keynote hangover: adrenaline spike meets mental meltdown</em></p><p><strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=263">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=263"> ~04:23</a></p><p>Andrew and Sean open by joking about the mental &#8220;mush&#8221; state one speaker is in after delivering a keynote address. 
They discuss how adrenaline from speaking at a conference can lead to a big energy crash immediately afterward, setting a relaxed&#8212;and slightly unpredictable&#8212;tone for the podcast.</p><div><hr></div><h4><strong>Local AI Models and the Driver&#8217;s Ed Analogy</strong></h4><p><em>Buckle up for AI: no learner&#8217;s permit required, but mind the wrecks</em></p><p><strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=540">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=540"> ~09:00</a></p><p>Sean and Andrew compare AI adoption to learning how to drive a car, emphasizing that if people aren&#8217;t taught the &#8220;rules of the road,&#8221; they risk major crashes. They highlight how the usual slow uptake of technology (giving time for society to adapt) is missing in the current, faster-paced AI era. The potential of locally hosted AI models&#8212;and the responsibility that comes with them&#8212;underscores the need for much more rapid &#8220;AI literacy.&#8221;</p><div><hr></div><h4><strong>The Compressed 21st Century: Accelerating Innovation</strong></h4><p><em>Time warp: a century of progress squeezed into a decade-long joyride</em></p><p><strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=960">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=960"> ~16:00</a></p><p>Andrew references that morning&#8217;s keynote to the Yidan Prize Conference and a provocative idea from Anthropic&#8217;s CEO: what if we see a century&#8217;s worth of technological change compressed into just a decade? 
This frames the notion that AI advances could drastically outpace humans&#8217; ability to adapt, urging educators and policymakers to reassess traditional assumptions about learning, work, and social structures.</p><div><hr></div><h4><strong>Questioning the Role of Education in an AI-Driven World</strong></h4><p><em>School&#8217;s out (forever?): rethinking classrooms when robots do the homework</em></p><p><strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=1200">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=1200"> ~20:00</a></p><p>Sean and Andrew delve into how AI challenges long-held ideas about why we teach and learn in the first place. If knowledge and intelligence become &#8220;free,&#8221; what is left for humans to cultivate? The conversation raises the specter that conventional classrooms, skill-based certifications, and even assessments might become obsolete or at least radically transformed.</p><div><hr></div><h4><strong>Imagining a Post-Scarcity Society</strong></h4><p><em>Abundance overload: when the daily grind is replaced by your dream hobby</em></p><p><strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=1775">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=1775"> ~29:35</a></p><p>Andrew and Sean envision a future in which AI handles most routine or even complex labor, potentially bringing us closer to a &#8220;post-scarcity&#8221; world. They discuss universal basic income studies&#8212;where people, freed from grinding jobs, devote themselves to more personally fulfilling or creative pursuits. 
This leads to questions about what &#8220;value creation&#8221; means if humans no longer need to work to survive.</p><div><hr></div><h4><strong>Beyond Exams: The Meaning of Learning and Value Creation</strong></h4><p><em>No more pop quizzes: finding humanity&#8217;s purpose when AI does the heavy lifting</em></p><p><strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=2880">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/5xgb8vyG1CGAV2TGpGo0vs?t=2880"> ~48:00</a></p><p>Closing out, Sean and Andrew propose that education&#8217;s ultimate role may shift away from workforce training to igniting curiosity, wonder, and community-building. If AI can do all the fact-based heavy lifting, schools might focus more on helping humans discover purpose and emotional fulfillment. However, they also note that such a drastic shift carries risks&#8212;from losing core skills if the AI &#8220;breaks&#8221; to navigating ethical pitfalls in a hyper-accelerated world.</p><div><hr></div><p>As always you can watch the episode on YouTube as well as listen to us:</p><div id="youtube2-jppCEfxFnZs" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;jppCEfxFnZs&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/jppCEfxFnZs?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you find the podcast useful or think others will enjoy listening, please do share it with colleagues and leave us a rating or review.</p><p>Thanks!</p><p>Andrew</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" 
href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>These are the themes that OpenAI o1-Pro thought were important &#8212; I would not have chosen &#8220;brain mush!&#8221; (I also take no responsibility for the subtitles &#128522;). Lightly edited. </p></div></div>]]></content:encoded></item><item><title><![CDATA[Tech Trends 2025: Living Intelligence, Quantum Breakthroughs, and Beyond]]></title><description><![CDATA[This week's episode of Modem Futura focuses on Amy Webb's Future Today Strategy Group&#8217;s 2025 Tech Trends Report]]></description><link>https://www.futureofbeinghuman.com/p/tech-trends-2025-living-intelligence</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/tech-trends-2025-living-intelligence</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 25 Mar 2025 13:55:50 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Yu_6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Yu_6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Yu_6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 424w, 
https://substackcdn.com/image/fetch/$s_!Yu_6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 848w, https://substackcdn.com/image/fetch/$s_!Yu_6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!Yu_6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Yu_6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:8240258,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/159826300?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!Yu_6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!Yu_6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 848w, https://substackcdn.com/image/fetch/$s_!Yu_6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!Yu_6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15f7594d-2fda-478b-a896-0b2ac17f9663_2912x1632.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>In the latest episode of <em>Modem Futura</em>, Sean Leahy and I discuss the just-released <a href="https://ftsg.com/making-the-most-of-ftsgs-2025-tech-trends-report/">2025 Tech Trends Report</a> from Amy Webb's <a href="https://ftsg.com/">Future Today Strategy Group</a>. As usual it&#8217;s a winding and serendipitous discussion (as ChatGPT informed me, rather censoriously, as I was preparing the summary below), but still full of our usual insights, humor, and unexpected connections.</p><p>Listen below on <a href="https://podcasts.apple.com/us/podcast/tech-trends-2025-exploring-futures-of-living-intelligence/id1771688480?i=1000700720813">Apple Podcasts</a> or on <a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9">Spotify</a> or <a href="https://www.youtube.com/watch?v=EC7tgHe3LNk">YouTube</a>. 
And if you want to jump to specific entry points, there&#8217;s a ChatGPT o1-Pro generated summary with approximate time stamps<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> below.</p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/tech-trends-2025-exploring-futures-of-living-intelligence/id1771688480?i=1000700720813&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000700720813.jpg&quot;,&quot;title&quot;:&quot;Tech Trends 2025: Exploring Futures of Living Intelligence&quot;,&quot;podcastTitle&quot;:&quot;Modem Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:4184000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/tech-trends-2025-exploring-futures-of-living-intelligence/id1771688480?i=1000700720813&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-03-25T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/tech-trends-2025-exploring-futures-of-living-intelligence/id1771688480?i=1000700720813" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><h3>Entry points from the brain of ChatGPT o1-Pro:</h3><h4><strong>Introducing the 2025 Tech Trends Report</strong></h4><p>Why a 1,000-Page Tech Trends Report Matters</p><p><strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=912">Approx. Timestamp:</a></strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=912"> 15:12&#8211;21:50</a></p><p>The hosts open the show, welcome newcomers to <em>Modem Futura,</em> and introduce Amy Webb&#8217;s newly released <strong>2025 Tech Trends Report</strong> from the Future Today Strategy Group. 
They discuss how the report is over 1,000 pages long, who it is aimed at (CEOs, investors, policy makers), and why understanding big-picture trends can help people and organizations &#8220;look around the horizon&#8221; in today&#8217;s rapidly shifting technology landscape.</p><div><hr></div><h4><strong>Living Intelligence &amp; Converging Technologies</strong></h4><p>When Biology and AI Collide</p><p><strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=1310">Approx. Timestamps:</a></strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=1310"> 21:50&#8211;28:15</a></p><p>The hosts explore the notion of &#8220;living intelligence&#8221;&#8212;where <strong>AI, sensor technology, and biotechnology</strong> begin to merge. They bring up ongoing research into blending <strong>biological neurons</strong> with computer chips and discuss how combining biology and cutting-edge AI may lead to systems that can learn and adapt in near-human (or beyond-human) ways over the next five years.</p><div><hr></div><h4><strong>Robots Beyond the Factory Floor</strong></h4><p>From Assembly Lines to Living Rooms</p><p><strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=1695">Approx. Timestamps:</a></strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=1695"> 28:15&#8211;35:00</a></p><p>A major theme is the <strong>next generation of robots</strong> leaving tightly controlled industrial settings to enter everyday environments. The hosts highlight the push toward <strong>humanoid robots</strong> (like those from Tesla or Figure AI), the potential creepiness factor of having a &#8220;person-shaped machine&#8221; in your home, and the huge possibilities for <strong>assistive uses</strong>&#8212;for instance, helping older adults remain independent. 
They also connect this rise of humanoid robots to the &#8220;uncanny valley&#8221; effect and societal acceptance challenges.</p><div><hr></div><h4><strong>Agentic AI &amp; Its Breakaway Potential</strong></h4><p>When AI Sets Its Own Goals</p><p><strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=2304">Approx. Timestamps:</a></strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=2304"> 38:24&#8211;45:00</a></p><p>The conversation turns to <strong>&#8220;agentic&#8221; AI</strong>&#8212;intelligent systems capable of making decisions and carrying out tasks <strong>without direct human oversight</strong>. They mull over the near-term implications (like delegating complex chores or entire research projects to AI) and the alignment/safety questions that arise. The hosts imagine scenarios where an agentic AI runs a scientific lab, orders chemicals, conducts experiments, and writes up results&#8212;dramatically transforming research, discovery, and daily life.</p><div><hr></div><h4><strong>Key Frontiers: Metamaterials, Quantum Computing, and Cislunar Space</strong></h4><p>Shaping the Future, From Atom-Scale Materials to the Moon</p><p><strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=2635">Approx. Timestamps:</a></strong><a href="https://open.spotify.com/episode/7Jj0MFynYB3jMCzVg0fGe9?t=2635"> 43:45&#8211;end (1:09:42)</a></p><p>In the final major stretch, the hosts run through additional &#8220;top 10&#8221; tech takeaways from the report&#8212;particularly <strong>metamaterials</strong> (engineered at the nanoscale for novel properties), <strong>quantum computing</strong> (and the all-important challenge of error-correction), and <strong>private enterprise in cislunar space</strong> (the emerging economy between Earth and the Moon). 
They tie these themes back to broader forces&#8212;like the climate crisis accelerating tech adoption and the prospect of advanced AI guiding discovery in metamaterials, nuclear power, and space infrastructure.</p><div><hr></div><p>As always you can watch the episode on YouTube as well as listen to us:</p><div id="youtube2-dRlaFxUjNMQ" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;dRlaFxUjNMQ&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/dRlaFxUjNMQ?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you find the podcast useful or think others will enjoy listening, please do share it with colleagues and leave us a rating or review.</p><p>Thanks!</p><p>Andrew</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>As generative AI still struggles to accurately align themes and timestamps in fast-paced conversations, I&#8217;ve tweaked these a bit. Plus, I was disappointed that ChatGPT completely failed to pick up on the Wallace and Gromit reference! 
Clearly there&#8217;s still some value in being human in the world &#8230;</p></div></div>]]></content:encoded></item><item><title><![CDATA[Hype, Hope, and the Human Element in AI and Education]]></title><description><![CDATA[This week's episode of Modem Futura touches on OpenAI's rumored $20k AI tiers, the Ethic of Care in classrooms, and how Dewey&#8217;s &#8220;four impulses&#8221; can reshape teaching and learning]]></description><link>https://www.futureofbeinghuman.com/p/hype-hope-and-the-human-element-in-ai</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/hype-hope-and-the-human-element-in-ai</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 18 Mar 2025 14:28:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!hnvw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hnvw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hnvw!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 424w, https://substackcdn.com/image/fetch/$s_!hnvw!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 848w, 
https://substackcdn.com/image/fetch/$s_!hnvw!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!hnvw!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hnvw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3386396,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/159264038?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hnvw!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 424w, 
https://substackcdn.com/image/fetch/$s_!hnvw!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 848w, https://substackcdn.com/image/fetch/$s_!hnvw!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 1272w, https://substackcdn.com/image/fetch/$s_!hnvw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e7cd355-a16c-41ff-97d5-1a4f3fc1e025_2688x1536.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>In the latest episode of <em>Modem Futura</em>, Sean Leahy and I dive into eye-watering rumors about OpenAI's new pricing tiers&#8212;$20k per month anyone? We discuss the pitfalls of over-hyping advanced AI, but also season this with a soup&#231;on of hope. I get excited about bringing the concept of &#8220;care&#8221; into technology innovation, and Sean channels his inner John Dewey to remind us why education should nurture curiosity, not just test-taking. </p><p>Listen below on <a href="https://podcasts.apple.com/us/podcast/ai-and-the-natural-impulses-for-learning/id1771688480?i=1000699612192">Apple Podcasts</a> or on <a href="https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD">Spotify</a> or <a href="https://www.youtube.com/watch?v=SfW9WWlX-d8">YouTube</a>. There are also jumping-in points below if you want to cut to the chase (this week brought to you courtesy of OpenAI&#8217;s Whisper technology and o1-Pro):</p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;AI and the Natural Impulses for Learning&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD&quot;,&quot;belowTheFold&quot;:false,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/2gYbx4rKoZu4jc76U9i8BD" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" data-component-name="Spotify2ToDOM"></iframe><p><strong><a href="https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD?t=659">OpenAI&#8217;s Rumored High-Cost Tiers</a><br></strong>10:59&#8211;16:07</p><p>We discuss reports that OpenAI may introduce extremely high-priced 
&#8220;tiers&#8221; (e.g., $2,000/month for knowledge worker AI, $10,000/month for developer AI, and $20,000/month for &#8220;PhD-level&#8221; AI). We talk about the credibility of these rumors, consider whether the market will bear such costs, and compare them to earlier $200/month plans.</p><p><strong><a href="https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD?t=967">The AI Hype Cycle and Potential Disillusionment</a><br></strong>16:07&#8211;25:00</p><p>The conversation turns to concern about the hype around AI, the possibility of overpromising, and how that might trigger a major &#8220;trough of disillusionment.&#8221; We question whether investors will remain patient, worry about the mismatch between current hype and real productivity gains (e.g., no measurable impact on GDP yet), and ponder if big AI companies might become &#8220;dinosaurs.&#8221;</p><p><strong><a href="https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD?t=1500">Rethinking Education with AI: Process vs. Product</a><br></strong>25:00&#8211;36:00</p><p>Our focus shifts to how AI tools affect teaching and learning. In our conversation we emphasize that education should be about process, not just outputs. We talk about how AI might replace rote tasks while also freeing educators and learners to engage more deeply. And we discuss the TPACK model (Technological Pedagogical Content Knowledge) and the importance of giving teachers agency over technology integration.</p><p><strong><a href="https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD?t=2160">The Ethic of Care in Technology and Teaching</a><br></strong>36:00&#8211;44:30</p><p>We introduce the concept of &#8220;care&#8221; as a framework and the idea that technology decisions should be guided by a duty of care toward learners and educators. This means asking how tools will positively or negatively impact real learning communities. 
We suggest that genuine concern for student well-being is often overlooked in top-down tech adoption.</p><p><strong><a href="https://open.spotify.com/episode/2gYbx4rKoZu4jc76U9i8BD?t=2670">Dewey&#8217;s Four Impulses and Reshaping Assessment</a><br></strong>44:30&#8211;59:21</p><p>Drawing on John Dewey&#8217;s four impulses (inquiry, communication, construction, expression), we explore how AI can spark creativity and curiosity rather than replace authentic learning. We also question the value of traditional tests and grades, noting that AI and new pedagogical approaches might enable a richer, more process-oriented mode of learning and assessment. (And extra points if you spot my fluff in this segment!)</p><p>As always, you can watch the episode on YouTube as well as listen to us:</p><div id="youtube2-m0uar4ZbxqM" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;m0uar4ZbxqM&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/m0uar4ZbxqM?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you find the podcast useful or even enjoy it (I&#8217;m kidding &#8212; of course you do!), please do share it with colleagues and leave us a rating or review.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><p>Thanks!</p><p>Andrew</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>One of the few ways we get feedback on whether the podcast is having the impact we hope is through ratings and reviews. 
We would be incredibly grateful if you were able to spend a couple of seconds rating us on <a href="https://podcasts.apple.com/us/podcast/modem-futura/id1771688480">Apple Podcasts</a> or <a href="https://open.spotify.com/show/3eFl4hY4t1qTCWE2Bxotrg">Spotify</a>. You need to be signed in to Spotify or the Apple Podcasts app, but from there it&#8217;s as simple as selecting how many stars you give us. And if you feel up to leaving us a review &#8212; even better. And to make things easier, we&#8217;ve <a href="https://youtu.be/SX2kWYFm5Ww">created a short video showing you how it&#8217;s done</a> &#128512; &#8212; Thanks!</p></div></div>]]></content:encoded></item><item><title><![CDATA[Five things we need to know about technological change]]></title><description><![CDATA[This week's episode of Modem Futura revisits Neil Postman's influential 1998 talk about understanding and navigating transformative technologies]]></description><link>https://www.futureofbeinghuman.com/p/five-things-we-need-to-know-about</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/five-things-we-need-to-know-about</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 11 Mar 2025 16:21:10 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!d05m!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!d05m!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!d05m!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!d05m!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 848w, https://substackcdn.com/image/fetch/$s_!d05m!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!d05m!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!d05m!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:8094986,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://futureofbeinghuman.com/i/158847588?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!d05m!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!d05m!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 848w, https://substackcdn.com/image/fetch/$s_!d05m!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!d05m!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e54fe91-7c64-4130-a7ee-cffc676db658_2912x1632.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" 
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>In March 1998, the Archbishop of Denver hosted an international conference on "The New Technologies and the Human Person: Communicating the Faith in the New Millennium." The meeting, by all accounts, brought together an eclectic group of clergy, theologians, educators, and technology experts concerned with how new media and technological change would affect society and religious faith as the year 2000 approached.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><p>And one of those speakers was the author and tech critic <a href="https://en.wikipedia.org/wiki/Neil_Postman">Neil Postman</a>.</p><p>Postman&#8217;s presentation at the 1998 <em>NewTech &#8217;98</em> conference has become something of a touchstone within communities of scholars and practitioners grappling with the intersection between society and emerging technologies. Titled <em>Five Things We Need to Know About Technological Change,</em> his talk was inspired by the potential effects of technology on religious faith &#8212; especially as the transition to the year 2000 approached. But his message resonates far beyond this. 
</p><p>Postman <a href="https://www.randynissen.net/uploads/9/3/2/2/9322219/5_things_postman.pdf">grounds his ideas</a> in &#8220;people whom we can trust, and whose thoughtfulness, it's safe to say, exceeds that of President Clinton, Newt Gingrich, or even Bill Gates&#8221; (remember, this was 1998) &#8212; people like (in Postman&#8217;s words) Henry David Thoreau, Goethe, Socrates, Jesus, Isaiah, Mohammad, Spinoza, and Shakespeare. </p><p>He also suggested: &#8220;I doubt that the 21st century will pose for us problems that are more stunning, disorienting or complex than those we faced in this century, or the 19th, 18th, 17th, or for that matter, many of the centuries before that.&#8221;</p><p>If he were giving the talk today, I suspect he&#8217;d include people like Zuckerberg, Musk, and Altman in his list of people whose &#8220;thoughtfulness&#8221; is exceeded (in some cases by rather a lot) by thinkers from the past. </p><p>But I do wonder whether he&#8217;d be so confident about how the past reflects the present &#8212; especially given the rate of technological change and sociopolitical upheaval we&#8217;re currently experiencing.</p><p>Despite this, though, his &#8220;five things&#8221; talk is still deeply relevant &#8212; perhaps more so now than it was 27 years ago. </p><p>Which is why we thought it would be worth doing a deep dive into it in <a href="https://podcasts.apple.com/us/podcast/ai-and-technological-change-the-postman-always-delivers/id1771688480?i=1000698705279">this week&#8217;s episode of </a><em><a href="https://podcasts.apple.com/us/podcast/ai-and-technological-change-the-postman-always-delivers/id1771688480?i=1000698705279">Modem Futura</a></em>.</p><p>We are, of course, far from the only futures-focused folks to have revisited Postman&#8217;s 1998 talk in recent times &#8212; a testament to how resilient his insights are. 
In fact, in preparing to write this post I came across an <a href="https://punyamishra.com/2022/08/09/the-postman-always-rings-twice-unpacking-mcluhan-3-3/">excellent summary of Postman&#8217;s talk</a> by our colleague Punya Mishra from a couple of years ago (which, embarrassingly, I hadn&#8217;t come across before we recorded).</p><p>But given everything that&#8217;s currently happening in the world of tech, a revisit seemed more than a little timely.</p><p>As always, you can listen to the episode below or on <a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s">Spotify</a>, <a href="https://www.youtube.com/watch?v=quPmsApKVeA">YouTube</a>, or anywhere else you get your podcasts. </p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/ai-and-technological-change-the-postman-always-delivers/id1771688480?i=1000698705279&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000698705279.jpg&quot;,&quot;title&quot;:&quot;AI and Technological Change: the Postman Always Rings Twice&quot;,&quot;podcastTitle&quot;:&quot;Modem Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:4078000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/ai-and-technological-change-the-postman-always-rings-twice/id1771688480?i=1000698705279&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-03-11T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/ai-and-technological-change-the-postman-always-delivers/id1771688480?i=1000698705279" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><p>And in case there are specific parts of the conversation you&#8217;re interested in, here are linked timestamps:</p><p><a 
href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s">00:00</a>  Cold open, and how to get in touch with us</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=540">09:00</a>  Looking back to look forward</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=586">09:46</a>  Introduction</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=660">11:00</a>  Revisiting Neil Postman&#8217;s 1998 Five Things we Need to Know about Technological Change</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=1260">21:00</a>  Getting distracted by Michael Crichton</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=1420">23:40</a>  Who was Neil Postman?</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=1480">24:40</a>  Five Things we Need to Know about Technological Change</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=1530">25:30</a>  1. All technologies come with tradeoffs</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=1975">32:55</a>  (Microdosing on AI)</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=2262">37:42</a>  2. There are winners and losers with all technological advances</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=2670">44:30</a>  3. All technologies have embedded biases</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=3155">52:35</a>  4. Technological change has deep systemic impacts</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=3435">57:15</a>  (Getting distracted again: Doctor Who this time!)</p><p><a href="https://open.spotify.com/episode/25Sc7GZHJnThjczpC6Ib9s?t=3635">1:00:35</a>  5. 
Over time, technologies take on a &#8220;mythic&#8221; quality</p><p>And as always, a reminder that you can watch us recording this week&#8217;s episode on YouTube:</p><div id="youtube2--MeqYS9SMjo" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;-MeqYS9SMjo&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/-MeqYS9SMjo?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Thanks for listening!</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Despite Neil Postman&#8217;s talk from this meeting being highly cited and discussed, it&#8217;s surprisingly hard to piece together details about the meeting itself. Based on some Google sleuthing and a bit of help from ChatGPT, the meeting was hosted by the Archdiocese of Denver under Archbishop Charles J. Chaput, in collaboration with the Catholic Church&#8217;s Pontifical Council for Social Communications. 
The lineup of speakers can be found <a href="https://www.aciprensa.com/reportajes/newtech">here</a>, and the link to Archbishop Chaput&#8217;s involvement comes from <a href="https://www.catholicculture.org/culture/library/view.cfm?recnum=619#:~:text=Most%20Rev,in%20March%20of%20this%20year">this source</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[AI: The Medium is the "Massage?"]]></title><description><![CDATA[Marshall McLuhan, The Medium is the Message, and other highlights from this week's episode of Modem Futura]]></description><link>https://www.futureofbeinghuman.com/p/ai-the-medium-is-the-message</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/ai-the-medium-is-the-message</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 18 Feb 2025 14:47:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!yte4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!yte4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!yte4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!yte4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 
848w, https://substackcdn.com/image/fetch/$s_!yte4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!yte4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!yte4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:7612517,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!yte4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!yte4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 848w, 
https://substackcdn.com/image/fetch/$s_!yte4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!yte4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F630a850e-36b6-44a8-8e2d-04f29588887f_2912x1632.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney (using version 6.1)</figcaption></figure></div><p>There&#8217;s a point in this week&#8217;s 
episode of Modem Futura where I fall into a literary trap head first! Sean was talking about Marshall McLuhan&#8217;s 1967 book on media and messaging &#8212; a book which I am embarrassed to say I haven&#8217;t read &#8212; and when he handed me a copy, I read the title out as &#8220;The Medium is the Message.&#8221;</p><p>Of course, anyone familiar with the book will know that it&#8217;s titled &#8220;The Medium is the Massage&#8221; &#8212; and I&#8217;d been &#8220;massaged&#8221; into misreading it!</p><p>Embarrassment aside, I was quite taken aback by the connections between McLuhan&#8217;s 1967 observations and what&#8217;s currently playing out with AI.</p><p>We explore a number of other topics in the podcast before we get to <em>The Medium is the Massage</em>, which is why you might find the time-stamped and linked topics below helpful. But this was such a serendipitous and revelatory conversation that you might want to listen to (or watch) the whole episode: available below, and in all the other places you find podcasts (including <a href="https://podcasts.apple.com/us/podcast/ai-the-medium-is-the-message/id1771688480?i=1000693353682">Apple</a>, <a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s">Spotify</a> and <a href="https://www.youtube.com/watch?v=P63eIzbLIMM">YouTube</a>).</p><div class="apple-podcast-container" data-component-name="ApplePodcastToDom"><iframe class="apple-podcast " data-attrs="{&quot;url&quot;:&quot;https://embed.podcasts.apple.com/us/podcast/ai-the-medium-is-the-message/id1771688480?i=1000693353682&quot;,&quot;isEpisode&quot;:true,&quot;imageUrl&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/podcast-episode_1000693353682.jpg&quot;,&quot;title&quot;:&quot;AI: the Medium is the Message&quot;,&quot;podcastTitle&quot;:&quot;Modem 
Futura&quot;,&quot;podcastByline&quot;:&quot;&quot;,&quot;duration&quot;:4121000,&quot;numEpisodes&quot;:&quot;&quot;,&quot;targetUrl&quot;:&quot;https://podcasts.apple.com/us/podcast/ai-the-medium-is-the-message/id1771688480?i=1000693353682&amp;uo=4&quot;,&quot;releaseDate&quot;:&quot;2025-02-18T08:00:00Z&quot;}" src="https://embed.podcasts.apple.com/us/podcast/ai-the-medium-is-the-message/id1771688480?i=1000693353682" frameborder="0" allow="autoplay *; encrypted-media *;" allowfullscreen="true"></iframe></div><p>You can also jump right in to the conversation at these entry points:</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s">00:00</a> Pre-show banter</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=53">00:53</a> Zoom backgrounds</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=377">06:17</a> Intro to the episode</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=463">07:43</a> Update: Writing a dissertation with OpenAI&#8217;s Deep Research</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=648">10:48</a> The power of AI + human</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=790">13:10</a> AI and the Artisanal Intellectual</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=930">15:30</a> Elon Musk and OpenAI</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1040">17:20</a> The AI capabilities vs implementation gap</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1300">21:40</a> The move toward reasoning AI models becoming the primary models</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1480">24:40</a> Marshall McLuhan and The Medium is the Message</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1540">25:40</a> The Medium is the <em>Massage</em></p><p><a 
href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1670">27:50</a> Technological affordances and adjacent possibilities</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1910">31:50</a> Technological playgrounds vs playpens</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=1970">32:50</a> Permissionless Innovation</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=2205">36:45</a> Connecting the global village</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=2550">42:30</a> Human ingenuity</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=2790">46:30</a> Is technology robbing us of our identity?</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=2920">48:40</a> We had to bring in Socrates (and a great quote on the evils of technology)</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3100">51:40</a> When does tech history stop repeating itself?</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3180">53:00</a> Neil Postman on five things to know about technological change</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3315">55:15</a> Revisiting permissionless innovation</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3420">57:00</a> Is there something fundamental to being human that is intransient?</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3620">1:00:20</a> The value of being an amateur rather than a professional</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3780">1:03:00</a> Are professionals constrained to playpens of the imagination?</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=3920">1:05:20</a> Oppenheimer and having an open mind</p><p><a href="https://open.spotify.com/episode/2BUDBDn47vCoOJa67rAn6s?t=4015">1:06:55</a> The future of 
being human and enabling creative thinkers</p><p>And finally, a reminder that you can watch us recording this week&#8217;s episode on YouTube:</p><div id="youtube2-P63eIzbLIMM" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;P63eIzbLIMM&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/P63eIzbLIMM?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>As always, if you&#8217;re enjoying these conversations on <em>Modem Futura</em>, please don&#8217;t forget to subscribe wherever you listen to your podcasts so you get new episodes delivered as they&#8217;re released.</p><p>And, of course, every rating or review helps us get better at what we&#8217;re doing &#8212; thanks in advance &#128522;</p>]]></content:encoded></item><item><title><![CDATA[AI humility, artisanal intellectuals, and Reid Hoffman's Superagency]]></title><description><![CDATA[Some of the highlights in this week's episode of Modem Futura]]></description><link>https://www.futureofbeinghuman.com/p/ai-humility-artisanal-intellectuals</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/ai-humility-artisanal-intellectuals</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 11 Feb 2025 20:59:47 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!_b6n!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!_b6n!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!_b6n!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!_b6n!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 848w, https://substackcdn.com/image/fetch/$s_!_b6n!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!_b6n!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!_b6n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png" width="1456" height="816" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:5271230,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!_b6n!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 424w, https://substackcdn.com/image/fetch/$s_!_b6n!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 848w, https://substackcdn.com/image/fetch/$s_!_b6n!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 1272w, https://substackcdn.com/image/fetch/$s_!_b6n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3889703c-62d2-4006-9f7a-973f889adeb6_2912x1632.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>If you read my recent posts here on The Future of Being Human Substack you&#8217;ll know that I&#8217;ve been playing with how far I can push OpenAI&#8217;s new model Deep Research.</p><p>My co-host Sean Leahy and I talk about this in the <a href="https://podcasts.apple.com/us/podcast/artisanal-intellectual-a-response-to-openais-deep/id1771688480?i=1000691037686">latest episode of the </a><em><a href="https://podcasts.apple.com/us/podcast/artisanal-intellectual-a-response-to-openais-deep/id1771688480?i=1000691037686">Modem Futura</a></em><a href="https://podcasts.apple.com/us/podcast/artisanal-intellectual-a-response-to-openais-deep/id1771688480?i=1000691037686"> podcast</a>, which we recorded just as I was beginning my latest experiment on trying to get Deep Research to write a complete PhD dissertation &#8212; at the time I wasn&#8217;t even sure this was possible.</p><p>Listen to the complete conversation 
below (or watch the video) &#8212; we touch on a wide range of cutting-edge AI topics, including whether AI will give rise to &#8220;artisanal intellectuals,&#8221; Reid Hoffman&#8217;s new book Superagency, the connections between Martin Heidegger&#8217;s 1954 work &#8220;The Question Concerning Technology&#8221; and AI, and much more:  </p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;Artisanal Intellectual: a response to OpenAI's Deep Research&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b&quot;,&quot;belowTheFold&quot;:false,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/6tQIQxINY3E5NsF8f2Re3b" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" data-component-name="Spotify2ToDOM"></iframe><p>Here are just a few places where you might want to dip into the conversation, though there are plenty of others:</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=615">10:15</a>: The value of OpenAI&#8217;s $200 a month plan</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=1020">17:00</a>: OpenAI&#8217;s Deep Research as an on-demand research analyst</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=1740">29:00</a>: The artisanal intellectual as someone who thinks without using AI</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=2040">34:00</a>: Heidegger and AI (a theme infused throughout the conversation)</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=2610">43:30</a>: Reid Hoffman&#8217;s book Superagency</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=2655">44:15</a>: The Quantum Mechanics Theory of Super Agency</p><p><a 
href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=2880">48:00</a>: Humility and AI</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=3120">52:00</a>: What do we stand to lose with advanced AI?</p><p><a href="https://open.spotify.com/episode/6tQIQxINY3E5NsF8f2Re3b?t=3345">55:45</a>: What is the value of a PhD?</p><p>You can also watch us recording the show on YouTube:</p><div id="youtube2-dSMo2zH-It4" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;dSMo2zH-It4&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/dSMo2zH-It4?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you&#8217;re enjoying these conversations on <em>Modem Futura</em>, please don&#8217;t forget to subscribe wherever you listen to your podcasts so you get new episodes delivered as they&#8217;re released.</p><p>And, of course, every rating or review helps us get better at what we&#8217;re doing &#8212; thanks in advance &#128522;</p>]]></content:encoded></item><item><title><![CDATA[Are your car's autonomous features dangerously out of calibration? 
]]></title><description><![CDATA[My co-host Sean Leahy and I are joined by advanced driver-assistance systems expert Brunno Moretti on this week's episode of Modem Futura as we explore all things sensor-related in modern cars]]></description><link>https://www.futureofbeinghuman.com/p/are-your-cars-autonomous-features-unsafe</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/are-your-cars-autonomous-features-unsafe</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 14 Jan 2025 14:23:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!jLly!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jLly!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jLly!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 424w, https://substackcdn.com/image/fetch/$s_!jLly!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 848w, https://substackcdn.com/image/fetch/$s_!jLly!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 1272w, 
https://substackcdn.com/image/fetch/$s_!jLly!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jLly!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1260348,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jLly!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 424w, https://substackcdn.com/image/fetch/$s_!jLly!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 848w, https://substackcdn.com/image/fetch/$s_!jLly!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 1272w, 
https://substackcdn.com/image/fetch/$s_!jLly!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F18894782-c7eb-4e80-922b-7560f0844f1a_1792x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>This week&#8217;s episode of <em><a href="https://podcasts.apple.com/us/podcast/modem-futura/id1771688480">Modem Futura</a></em> was an eye opener for me. 
We were joined by vehicle advanced driver-assistance systems (ADAS) expert <a href="https://councils.forbes.com/profile/Brunno-Moretti-President-ADAS-Solutions-Ascential-Tech/f9d2bde7-711d-4b17-b032-5c71e3fcac98">Brunno Moretti</a> (President of ADAS Solutions at Ascential Technologies) for a wide-ranging chat about vehicle sensors and self-driving cars. We touched on everything from the pros and cons of lidar versus optical cameras in autonomous vehicles, to the economic and technological challenges of incorporating advanced driver-assist tech into vehicles.</p><p>But what struck me more than anything in our conversation were the hidden dangers of the sensor-based safety features that are becoming ubiquitous in the cars we drive.</p><p>The challenge, it turns out, is that the sensors that warn of everything from potential collisions (even applying the brakes if necessary) to letting you know if you&#8217;re drifting are not being appropriately recalibrated after most crashes.</p><p>This means that there are a lot of cars on the road where these safety systems are not functioning as they should, putting people in danger as a result.</p><p>It&#8217;s an issue that Brunno <a href="https://www.forbes.com/councils/forbestechcouncil/2025/01/02/adas-sensors-post-repair-a-hidden-threat-to-road-safety/">wrote about recently for Forbes</a>, and something we explore &#8212; along with a number of other topics &#8212; as we get under the hood (sorry!) 
of today&#8217;s sensor-dependent vehicles in this week&#8217;s episode of <em>Modem Futura</em>:</p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;Autonomous Vehicles and Systems with Brunno Moretti&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/4wa0MzJkm95bEdZ7UgQYS7&quot;,&quot;belowTheFold&quot;:false,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/4wa0MzJkm95bEdZ7UgQYS7" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" data-component-name="Spotify2ToDOM"></iframe><p>Catch the full episode above or wherever you get your podcasts (including, of course, <a href="https://podcasts.apple.com/us/podcast/autonomous-vehicles-and-systems-with-brunno-moretti/id1771688480?i=1000683900393">Apple Podcasts</a>).</p><p>And as with other episodes, you can also see as well as listen to us over on the <em>Modem Futura</em> YouTube channel:</p><div id="youtube2-MAi59zKiSs0" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;MAi59zKiSs0&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/MAi59zKiSs0?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you&#8217;re enjoying these conversations on <em>Modem Futura</em>, please don&#8217;t forget to subscribe wherever you listen to your podcasts so you get new episodes delivered as they&#8217;re released.</p><p>And, of course, every rating or review helps us get better at what we&#8217;re doing &#8212; thanks in advance 
&#128522;</p>]]></content:encoded></item><item><title><![CDATA[A conversation with former astronaut Cady Coleman ]]></title><description><![CDATA[On this week's episode of Modem Futura, Sean Leahy and I had the opportunity to talk all things space with my good friend and former NASA astronaut Cady Coleman]]></description><link>https://www.futureofbeinghuman.com/p/a-conversation-with-cady-coleman</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/a-conversation-with-cady-coleman</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 07 Jan 2025 13:27:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!8-g-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8-g-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!8-g-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!8-g-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!8-g-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 1272w, 
https://substackcdn.com/image/fetch/$s_!8-g-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!8-g-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3182640,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!8-g-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!8-g-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!8-g-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 1272w, 
https://substackcdn.com/image/fetch/$s_!8-g-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8535282b-f1e4-4b94-a0e9-a195d10f1649_2048x1152.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>This week&#8217;s episode of <em>Modem Futura</em> was a real treat. 
I got to pair up again with good friend and former podcast co-host (not to mention NASA astronaut) Cady Coleman.</p><p>When we used to co-host the podcast <em>Mission: Interplanetary</em> together, Cady and I were the ones asking the questions. But this week we turned the tables and I had the chance to chat with Cady directly about her work, life, and passions.</p><p>Sean and I got to talk with her about going to space and her experiences on the International Space Station, her thoughts about humans in space, women in STEM, the importance of being inclusive in all things space-related, and &#8212; of course &#8212; her recent book <em><a href="https://www.cadycoleman.com/shop/p/sharing-space-personalized">Sharing Space</a></em>.</p><p>We had an absolute blast &#8212; and it was so humbling talking with someone who&#8217;s achieved so much. One of my favorite podcast episodes to date. </p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;Humans in Space with Astronaut Cady Coleman&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/2UV9cBFIQTlBtdKG8wgJ05&quot;,&quot;belowTheFold&quot;:false,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/2UV9cBFIQTlBtdKG8wgJ05" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" data-component-name="Spotify2ToDOM"></iframe><p>For  context, Cady&#8217;s a former NASA Astronaut and a retired US Air Force Colonel with more than 180 days in space, accumulated during two space shuttle missions and a six-month expedition to the International Space Station (ISS) as the Lead Robotics and Lead Science officer. 
She&#8217;s served in a variety of roles within the Astronaut Office, including Chief of Robotics, and lead astronaut for the integration of supply ships from NASA&#8217;s commercial partners. And to cap it all, she coached Sandra Bullock from the ISS in preparation for her astronaut role in <em>Gravity</em>; played a flute duet with Jethro Tull&#8217;s Ian Anderson while she was on the ISS; and occasionally plays with the Irish band The Chieftains (including from space when she was in orbit!).</p><p>Catch the full episode above or wherever you get your podcasts (including, of course, <a href="https://podcasts.apple.com/us/podcast/humans-in-space-with-astronaut-cady-coleman/id1771688480?i=1000682985476">Apple Podcasts</a> and <a href="https://open.spotify.com/episode/2UV9cBFIQTlBtdKG8wgJ05">Spotify</a>).</p><p>You can also see as well as listen to us over on the <em>Modem Futura</em> YouTube channel:</p><div id="youtube2-m4dEcwooxhQ" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;m4dEcwooxhQ&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/m4dEcwooxhQ?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>If you&#8217;re enjoying these conversations on <em>Modem Futura</em>, don&#8217;t forget to subscribe wherever you listen to your podcasts so you get new episodes delivered as they&#8217;re released.</p><p>And, of course, every rating or review helps us get better at what we&#8217;re doing &#8212; thanks in advance &#128522;</p>]]></content:encoded></item><item><title><![CDATA[Beyond the Horizon: Space, Technology, and the Human Touch ]]></title><description><![CDATA[An end of year retrospective/prospective with Modem Futura hosts Sean Leahy and Andrew 
Maynard, and guests Caity Roe and Joe O'Rourke]]></description><link>https://www.futureofbeinghuman.com/p/modem-futura-beyond-the-horizon-space-technology</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/modem-futura-beyond-the-horizon-space-technology</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Tue, 31 Dec 2024 15:02:56 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ocav!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ocav!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ocav!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!ocav!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!ocav!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!ocav!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ocav!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3157968,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ocav!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 424w, https://substackcdn.com/image/fetch/$s_!ocav!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 848w, https://substackcdn.com/image/fetch/$s_!ocav!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!ocav!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff811a52f-ea38-40fd-95a1-dfb75f3680ec_2048x1152.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button 
tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Image: Midjourney</figcaption></figure></div><p>Having hinted in my last post that I&#8217;m trying to avoid being pulled into the usual end of year analysis of what&#8217;s just gone down and what&#8217;s coming down the pike, I have to confess that we did venture into these waters in this week&#8217;s episode of <em>Modem Futura</em>!</p><p>In this episode Sean and I are joined by two fantastic colleagues &#8211; Caity Roe and Joe O&#8217;Rourke &#8212; both of whom focus on different aspects of space in their research. 
But for this episode they were brave enough to venture beyond their comfort zones as we launched into a delightfully serendipitous 2024/25 retrospective/prospective.</p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;2024 Retrospective Prospective&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/3OPMEZDShlmrBOudlLtOGa&quot;,&quot;belowTheFold&quot;:false,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/3OPMEZDShlmrBOudlLtOGa" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" data-component-name="Spotify2ToDOM"></iframe><p>In our conversation we touch on topics ranging from the Europa Clipper mission and the implications of feminist theory in space exploration, to biological intelligence, quantum computing breakthroughs, and the future of AI. And true to form, we dive down plenty of irresistible rabbit holes along the way!   
</p><p>Catch the full episode above or wherever you get your podcasts (including, of course, <a href="https://podcasts.apple.com/us/podcast/2024-retrospective-prospective/id1771688480?i=1000682211450">Apple Podcasts</a> and <a href="https://open.spotify.com/episode/3OPMEZDShlmrBOudlLtOGa">Spotify</a>).</p><p>You can also watch as well as listen to us over on the <em>Modem Futura</em> YouTube channel:</p><div id="youtube2-ee90Oc8bFSg" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;ee90Oc8bFSg&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/ee90Oc8bFSg?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>That&#8217;s it for 2024 &#8212; see you in 2025!</p>]]></content:encoded></item><item><title><![CDATA[Why Modem Futura is more than just another tech podcast]]></title><description><![CDATA[As we pass our tenth full episode of Modem Futura, I take a look at why we started, what we hope to achieve, and what makes this more than just another tech podcast]]></description><link>https://www.futureofbeinghuman.com/p/why-modem-futura-is-more-than-just-another-tech-podcast</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/why-modem-futura-is-more-than-just-another-tech-podcast</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Sun, 22 Dec 2024 14:58:35 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!yv7B!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" 
target="_blank" href="https://substackcdn.com/image/fetch/$s_!yv7B!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!yv7B!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 424w, https://substackcdn.com/image/fetch/$s_!yv7B!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 848w, https://substackcdn.com/image/fetch/$s_!yv7B!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!yv7B!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!yv7B!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:7166671,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!yv7B!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 424w, https://substackcdn.com/image/fetch/$s_!yv7B!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 848w, https://substackcdn.com/image/fetch/$s_!yv7B!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!yv7B!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fea771ca6-6c23-43dd-9c6c-61cc0d67a9a2_5003x2814.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by Charlie Leight/ASU News</figcaption></figure></div><p>This past week we hit the magic tenth episode of  <em><a href="https://futureofbeinghuman.asu.edu/2024/10/09/modem-futura-podcast/">Modem Futura</a> &#8212; </em>the podcast my co-host Sean Leahy and I launched out of Arizona State University&#8217;s <em><a href="https://futureofbeinghuman.asu.edu/">Future of Being Human initiative</a></em> back in October.<em> </em></p><p>The number&#8217;s important as it feels like we&#8217;ve passed a milestone in becoming part of the community of tech-forward podcasts &#8212; plus it was our first recording in front of a live audience! 
</p><p>The milestone&#8217;s also a chance to revisit why we launched <em>Modem Futura</em> in the first place, and why I believe it&#8217;s far more than just another tech podcast.</p><p>For this, though, I need to go back to the genesis of the podcast&#8217;s home &#8212; Arizona State University&#8217;s <em>Future of Being Human initiative</em>. </p><div id="youtube2-Jx2K6yCwNJk" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;Jx2K6yCwNJk&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/Jx2K6yCwNJk?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p><em>For our tenth episode of </em>Modem Futura<em> we had the privilege of recording in front of a live audience in ASU&#8217;s <a href="https://thunderbird.asu.edu/">Thunderbird School of Global Management</a> &#8212; in <a href="https://thunderbird.asu.edu/about/thepub">Thunderbird&#8217;s renowned Pub</a> no less!</em></p><p>Back in 2022, when we launched the <em>Future of Being Human initiative</em>, <a href="https://futureofbeinghuman.asu.edu/about/">I wrote on the initiative&#8217;s website</a>: </p><blockquote><p>Over the past 100 years, our collective capacity to imagine and influence the future has evolved faster than at any previous point in human history. 
Yet we have barely begun to scratch the surface of what is possible.</p><p>Over the coming 100 years we will experience vastly greater leaps in our ability to design and change the future through advanced technologies, and these in turn will have a profound impact on what it means to be human.</p><p>As we stand on the brink of what may be one of the most transformative eras in human history, the Future of Being Human initiative brings together a visionary community of thinkers and future-builders who are inspired by what it might mean to be human a hundred years from now, and how this in turn catalyzes their ideas and actions in the present.</p></blockquote><p>Our vision was <em>not</em> one of a research center, or even an educational endeavor (although we do both), but of an initiative that would catalyze thinking at scale around the future at this profoundly unique point in human history. </p><p>It was a vision that responded to a &#8220;growing need for bold ideas and visionary insights that transcend the constraints of conventionality and empower a new wave of thought leaders to be part of building a vibrant future that benefits all of humanity.&#8221;</p><p><em>Modem Futura</em> is part of this vision of catalyzing thinking at scale. </p><p>Given this, we intentionally set out to create a podcast that would draw listeners in from around the world and make them feel as if they were a part of unique and compelling conversations about the future. </p><p>Because of this, the podcast is intentionally authentic and conversational. You get to hear Sean and me exploring ideas with each other and our guests &#8220;in the raw&#8221; and with minimal editing (apart from Sean&#8217;s meticulous attention to audio detail). </p><p>It&#8217;s not a style that suits everyone. 
But it is one that, I hope, eschews any sense of being &#8220;talked-at&#8221; in favor of creating a space where listeners feel they have permission to explore nuanced and complex ideas with us.</p><p>This in itself places <em>Modem Futura</em> amongst a very small number of similar podcasts. But there&#8217;s an additional aspect to the podcast that, I believe, makes us stand out even further. </p><p>Each week, the conversations that Sean and I have are part of our own intellectual journeys. Our conversations are places where we get to think about and test new ideas, to be surprised by novel perspectives, and to be delighted by the serendipity of unexpected insights. </p><p>And rather than being confined to the inaccessible corridors of academia, these are journeys that our listeners are invited to join us on. </p><p>In other words, <em>Modem Futura</em> is a space where Sean and I get to explore and expand our own ideas and understanding, while simultaneously making them accessible, engaging, and relevant to a broad and diverse audience.  </p><p>Back in November, Sean and I were interviewed about <em>Modem Futura</em> <a href="https://news.asu.edu/20241127-science-and-technology-podcast-explores-future-rapidly-evolving-world">for an article published by ASU</a>, and much of what I&#8217;ve written above is reflected in our responses.</p><p>It&#8217;s worth reading the whole article. But I wanted to wrap up this piece with the final question we were asked: What do you hope your listeners will take away from the podcast?</p><p>This was our response:</p><blockquote><p>One of the reasons we established the ASU Future of Being Human initiative is the growing need for new ways of thinking and talking about the future in a technologically complex world. And the podcast provides a unique opportunity to do this as we catalyze new thinking and ideas at scale. 
It also responds to a growing hunger for informed conversations around transformative technologies and the future.</p><p>The thing is, you can&#8217;t do this by being preachy or polarizing or boring &#8212; you have to be worth listening to, you have to be engaging and you have to build meaningful relationships with your listeners, whether they&#8217;re high school students, retired or anyone in between.</p><p>I hope we achieve this, because it&#8217;s never been more important that a public university like ASU is at the forefront of creating spaces where everyone &#8212; no matter who they are &#8212; can be part of exploring the futures we collectively aspire to in an age of unprecedented technological advances.</p><p>Which is a rather long-winded way of saying that I hope our listeners are entertained, engaged, amazed, awed, frequently excited, sometimes shocked and ultimately energized as we explore what it means to be part of creating better futures at one of the most transformative points in human history.    </p></blockquote><p>This, to me, gets to the heart of why this is not just another tech podcast. 
At its core, <em>Modem Futura</em> is an invitation &#8212; an invitation to think differently, dream boldly, and join us in exploring and even shaping what it means to be human in the most technologically transformative age we&#8217;ve ever lived through.</p><p>And just because I&#8217;m constantly being told that calls to action are important, you can accept the invitation by following <em>Modem Futura</em> on <a href="https://podcasts.apple.com/us/podcast/modem-futura/id1771688480">Apple Podcasts</a>, <a href="https://open.spotify.com/show/3eFl4hY4t1qTCWE2Bxotrg">Spotify</a>, <a href="https://www.youtube.com/@ModemFutura">YouTube</a> (yes, we have a video feed), or wherever else you get your podcast fix &#128522;</p><p>Thanks!</p><div><hr></div><p><em>PS - We will be posting two not-to-be-missed holiday specials of the podcast on December 24 and December 31 &#8212; no spoilers here, but worth following (and keeping an eye on YouTube) for these as the conversations we had with our guests were amazing!</em></p><p></p><p></p><p>  </p>]]></content:encoded></item><item><title><![CDATA[AI in a world of Trump]]></title><description><![CDATA[This week's episode of the Modem Futura podcast on artificial intelligence is more relevant than ever given the results of the US election]]></description><link>https://www.futureofbeinghuman.com/p/ai-in-a-world-of-trump</link><guid isPermaLink="false">https://www.futureofbeinghuman.com/p/ai-in-a-world-of-trump</guid><dc:creator><![CDATA[Andrew Maynard]]></dc:creator><pubDate>Wed, 06 Nov 2024 22:45:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!e0Xp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://podcasts.apple.com/us/podcast/can-machines-think/id1771688480?i=1000675736454" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!e0Xp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 424w, https://substackcdn.com/image/fetch/$s_!e0Xp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 848w, https://substackcdn.com/image/fetch/$s_!e0Xp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!e0Xp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!e0Xp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/eea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2333337,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:&quot;https://podcasts.apple.com/us/podcast/can-machines-think/id1771688480?i=1000675736454&quot;,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!e0Xp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 424w, https://substackcdn.com/image/fetch/$s_!e0Xp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 848w, https://substackcdn.com/image/fetch/$s_!e0Xp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!e0Xp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Feea07ec4-91cb-4c58-b24a-b4acb4e867eb_3340x2227.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Recording the Modem Futura podcast with me on the left and my co-host Sean Leahy on the right</figcaption></figure></div><p>This week&#8217;s episode of the <em><a href="https://futureofbeinghuman.asu.edu/2024/10/09/modem-futura-podcast/">Modem Futura</a></em><a href="https://futureofbeinghuman.asu.edu/2024/10/09/modem-futura-podcast/"> podcast</a> &#8212; which focuses on artificial intelligence &#8212; was recorded a few weeks ago; long before last night&#8217;s decisive victory for Donald Trump in the US election. And so it was with some anxiety that I re-listened to it this morning.  </p><p>Thankfully, what I worried would now sound outdated is, if anything, more relevant than ever as we potentially face a significant political shift in approaches to AI in the US. 
</p><p>When my co-host Sean Leahy and I sat down to record the episode, we set out to have a wide-ranging discussion about AI and society &#8212; prompted by Alan Turing&#8217;s 1950 paper <em><a href="https://phil415.pbworks.com/f/TuringComputing.pdf">Computing Machinery and Intelligence</a></em> and his question &#8220;Can machines think?&#8221; </p><p>As we explore in the podcast, both the question and the way that Turing frames and unpacks it are highly relevant to modern-day advances in AI &#8212; and how these translate into ways the technology will potentially impact society.</p><p>That initial spark provided by Turing&#8217;s paper led to an expansive discussion between Sean and me that touched on everything from the nature of thought, consciousness, and identity to &#8220;superhero archetypes,&#8221; AI ethics, and responsible AI. </p><p>But at the heart of the conversation was the driving question of how, given the potential disruptive power of advanced AI, do we make sure that the transformations it brings about lead to a better future?</p><p>This is where I was worried that our conversation would no longer make sense given anticipated shifts in technology policy under Trump. 
Listening back, though, the ideas and perspectives we explore are more important than ever if we&#8217;re going to collectively ensure AI is a clear and unequivocal technology for good.</p><p>They are also perspectives that, I would strongly argue, should be informing thinking around AI policy in the incoming administration.</p><p>While it&#8217;s not yet clear how the Trump administration will be approaching technology innovation policy, there have been clear signals that we&#8217;re likely to see efforts to reduce government oversight, open up pathways to rapid economic growth, and embrace a more &#8220;permissionless&#8221; approach to innovation &#8212; especially given Elon Musk&#8217;s relationship with Donald Trump and his potential role in the new administration.</p><p>This is likely to have a profound impact on the evolving ecosystem around advanced AI in the US &#8212; and one that will likely emphasize US-centric short-term gains which (to some at least) promise long-term rewards; all unhindered by overly restrictive government regulation.</p><p>While this will no doubt bring proponents of reducing AI hype and increasing AI regulation out in a cold sweat, the indications are that, under Trump, responsible and beneficial AI will depend less on government oversight and more on a tapestry of soft governance mechanisms that rely increasingly on developers and their key stakeholders &#8212; including consumers.</p><p>And this is where the conversation that Sean and I have on <a href="https://podcasts.apple.com/us/podcast/can-machines-think/id1771688480?i=1000675736454">this week&#8217;s episode of </a><em><a href="https://podcasts.apple.com/us/podcast/can-machines-think/id1771688480?i=1000675736454">Modem Futura</a></em> is more relevant than ever to how we collectively approach the future of AI.</p><p>If you haven&#8217;t listened to the episode yet, you can catch it on <a 
href="https://podcasts.apple.com/us/podcast/can-machines-think/id1771688480?i=1000675736454">Apple Podcasts</a>, <a href="https://open.spotify.com/episode/4c9yZB21692LJ4v3ksT0Fy">Spotify</a>, or wherever you get your podcasts &#8212; or by using the player below. </p><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a1a907178887585c6af3860bc&quot;,&quot;title&quot;:&quot;Can Machines Think?&quot;,&quot;subtitle&quot;:&quot;Sean Leahy, Andrew Maynard&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/4c9yZB21692LJ4v3ksT0Fy&quot;,&quot;belowTheFold&quot;:true,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/4c9yZB21692LJ4v3ksT0Fy" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" loading="lazy" data-component-name="Spotify2ToDOM"></iframe><p>Of course, because we weren&#8217;t even thinking about the election when we recorded the conversation, it dances around AI policies and approaches to responsible innovation without tackling them head-on. But it still explores ideas and perspectives that we ignore at our peril if the end goal is AI that improves lives rather than makes them worse &#8212; whoever&#8217;s at the political helm.                  </p><p>  </p>]]></content:encoded></item></channel></rss>