State AI Regulation Faces Industry Pushback and Preemption Threats
TL;DR
- State-level AI regulation, like New York's proposed RAISE Act, front-loads safety costs by requiring frontier labs to disclose safety plans and critical incidents, aiming to head off catastrophic failures before they happen, much as earlier industries were eventually forced to account for product dangers they had concealed.
- The AI industry's significant financial investment in targeting politicians like Alex Bores highlights a strategic effort to shape regulatory landscapes, demonstrating the substantial economic stakes and lobbying power involved.
- AI's pervasive impact across political issues, from labor markets to wealth inequality and geopolitical competition with China, necessitates proactive policy responses that balance innovation with societal risks.
- The "RAISE Act" defines frontier models based on compute spending and computational operations, aiming to regulate only the most advanced AI research labs and mitigate risks from knowledge distillation techniques.
- While AI can offer immense benefits, the potential for misuse, such as bioweapons or sophisticated deepfakes, underscores the critical need for thoughtful policy and technological solutions like content provenance standards.
- Government effectiveness hinges on implementation and data-driven performance tracking, as exemplified by successful telemarketing fine increases and the identification of registration renewal issues for mopeds.
- The challenge of AI regulation is compounded by the potential for preemption by federal executive orders, creating tension between state-led safety initiatives and a push for a single national rule.
Deep Dive
The AI industry is engaging directly in electoral politics, spending heavily against policymakers who propose regulation in an effort to shape legislation in its favor. The strategy reflects the industry's view of state-level rules as an existential threat to its rapid growth, and it highlights a tension between innovation and safety that is becoming a central theme in political debates.
Alex Bores, a New York State Assemblymember and candidate for Congress, has become a focal point of this pushback because of his sponsorship of the RAISE Act. The proposed legislation would require major frontier AI labs to publicly disclose safety plans and critical incidents, and would prohibit the release of models that fail the labs' own internal safety tests, drawing parallels to earlier industries that concealed known product dangers. The industry's investment in opposing Bores, including a super PAC that has reportedly committed $100 million, underscores how seriously it takes such regulations. The RAISE Act's thresholds, which focus on companies with substantial compute spending and model scale, are designed to apply scrutiny to the largest AI developers without unduly burdening smaller entities.
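To make those thresholds concrete, here is a minimal sketch of how a RAISE-Act-style coverage test might look in code. The specific cutoffs (10^26 training operations, $100 million in compute spend) and the treatment of distilled models are assumptions based on how the bill has been described in public reporting, not a quotation of the statutory text.

```python
from dataclasses import dataclass

# Assumed cutoffs for illustration only; the bill text controls.
FLOP_THRESHOLD = 1e26                  # total computational operations used in training
COMPUTE_COST_THRESHOLD = 100_000_000   # USD spent on training compute


@dataclass
class ModelTrainingRun:
    name: str
    training_flops: float        # total operations used to train the model
    compute_cost_usd: float      # dollars spent on that compute
    distilled_from_frontier: bool = False  # derived via knowledge distillation of a covered model


def is_covered_frontier_model(run: ModelTrainingRun) -> bool:
    """Return True if a training run would plausibly fall under a
    RAISE-Act-style frontier definition (a sketch, not legal advice)."""
    exceeds_thresholds = (
        run.training_flops > FLOP_THRESHOLD
        and run.compute_cost_usd > COMPUTE_COST_THRESHOLD
    )
    # Distilled offshoots are treated as covered so labs cannot sidestep
    # the thresholds by shrinking a frontier model after training.
    return exceeds_thresholds or run.distilled_from_frontier
```

The point of pairing both tests is that academic groups and smaller startups stay out of scope, while distilled offshoots of covered models are still caught.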
The implications of this fight extend beyond a single bill. The industry's political spending raises concerns about regulatory capture, in which well-funded players sway policy in their favor and stall necessary safety measures. At the same time, relying on state-level regulation, as New York is doing, points toward a fragmented patchwork of AI governance that could create compliance challenges for companies and, if other countries adopt looser rules, put American firms at a disadvantage.

Bores's background as a former data scientist at Palantir gives him an unusual vantage point: he can speak to the technical details of AI regulation and answer industry arguments on their own terms. His emphasis on implementation and data-driven performance assessment, along with practical consumer protections such as "click to cancel" for subscriptions, suggests a vision of effective governance that goes beyond headline AI debates. That the industry is targeting him anyway, despite his technical expertise, signals a broader strategy to shape the political landscape and head off rules that could constrain growth, especially against the backdrop of US-China technological competition.
Action Items
- Audit AI frontier labs: Require public safety plans and critical incident disclosures for 5-7 major labs (ref: RAISE Act).
- Implement content provenance standard: Mandate cryptographic proof (e.g., C2PA) for AI-generated media by default across 3-5 platforms (see the sketch after this list).
- Track government program outcomes: Measure effectiveness of 2-3 legislative initiatives against pre-defined metrics and adjust based on data.
- Develop AI education curriculum: Design modular training for 10-15 government roles on AI capabilities and ethical considerations.
- Standardize crypto regulations: Codify existing guidance into statute for 3-5 key areas to maintain New York's regulatory framework.
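For the content provenance item above, here is a minimal sketch of the core mechanism, assuming a generic public-key signature over the media bytes rather than the actual C2PA manifest format. It uses the Python `cryptography` package; names like `media_bytes` are placeholders.

```python
# Generic provenance-by-signature sketch (not the real C2PA manifest format).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Publisher side: sign the media bytes at creation or export time.
media_bytes = b"...rendered image or video bytes..."
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(media_bytes)
public_key = private_key.public_key()

# Platform/consumer side: check that the bytes are unaltered and really
# came from the holder of the private key.
try:
    public_key.verify(signature, media_bytes)
    print("Provenance check passed")
except InvalidSignature:
    print("No valid provenance: content was altered or unsigned")
```

In an actual C2PA deployment the signature covers a structured manifest (who made the asset, with which tool, what edits were applied), and verification chains back to a certificate from a trusted issuer rather than a raw public key.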
Key Quotes
"The politics of AI are already exploding. Whether we're talking about data centers, electricity prices, labor displacement, water consumption, competition with China, or users of chatbots becoming psychotically obsessed, AI is already a major topic in elections."
The hosts, Joe Weisenthal and Tracy Alloway, introduce the pervasive influence of AI on political discourse. They highlight that AI is not just a future concern but a present reality impacting numerous facets of society that are inherently political.
"The argument that you hear from AI proponents is that any regulation needs to balance you know safety with innovation because there's also the question of China which that's another hot button political topic right this idea that the us is in an existential battle with china over AI and we have to win at all costs therefore the industry must not be tightly regulated."
Weisenthal explains a common argument from the AI industry, framing regulation as a potential hindrance to innovation and a disadvantage in geopolitical competition with China. This perspective suggests that to maintain a competitive edge, AI development should face minimal regulatory oversight.
"The industry, of course, views state-level regulation as an existential threat to their business. So on this episode we speak with Alex about how he views AI and the optimal approach to regulation."
Alloway introduces Alex Bores, a politician targeted by an AI industry super PAC, and frames the conflict as one where the industry perceives state-level AI regulation as a significant threat to its operations. The episode aims to explore Bores's perspective on AI and its regulation.
"The raise act would for the first time put safety standards on advanced AI research they really didn't like that bill -- and so they announced me as public enemy number one and the initial announcement said they're planning to spend multiple millions against me last week they upped it to 10 million I'm hoping if the campaign continues I can use up all the hundred million that they've planned but we'll see where it goes."
Bores explains that the RAISE Act, which he sponsored, aims to establish safety standards for advanced AI research, a move that has drawn significant opposition and financial backing from the AI industry against his campaign. He notes the escalating spending by the industry to counter his legislative efforts.
"The problem is that discovery software right all of the data was there but it was just taught to think of an excel document as a document that a lawyer would read and so there was no way in the software to track those individual loans but if you think of an individual loan as its own object that should be something that can be searched and tracked across the database then that analysis becomes very easy to do."
Bores elaborates on his experience at Palantir, illustrating how a shift in data analysis perspective, from viewing data as documents to viewing individual data points as searchable objects, can unlock complex insights. This approach was crucial in analyzing financial behavior leading up to the Great Recession.
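A minimal sketch of the modeling shift Bores describes: instead of treating a spreadsheet as an opaque document, each loan becomes its own record that can be queried across the entire dataset. The field names and sample data here are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Loan:
    loan_id: str
    borrower: str
    originator: str
    amount: float
    delinquent: bool


def delinquent_loans_by_originator(loans: list[Loan], originator: str) -> list[Loan]:
    """A query that is trivial once each loan is a first-class record,
    but effectively impossible when loans only exist as rows inside
    spreadsheets that the software treats as opaque documents."""
    return [loan for loan in loans if loan.originator == originator and loan.delinquent]


# Hypothetical data: the same question asked across many filings at once.
loans = [
    Loan("L-001", "Borrower A", "Example Mortgage Co.", 250_000.0, True),
    Loan("L-002", "Borrower B", "Example Mortgage Co.", 410_000.0, False),
    Loan("L-003", "Borrower C", "Another Lender", 180_000.0, True),
]
print(delinquent_loans_by_originator(loans, "Example Mortgage Co."))
```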
"The same capabilities that will allow us to cure diseases could in the wrong hands allow someone to build a bioweapon and we just need to be thoughtful on how we go."
Bores expresses a dual perspective on AI's potential, acknowledging its remarkable capacity for advancements like curing diseases while also warning of its dual-use nature, where the same capabilities could be exploited for harmful purposes such as bioweapon development. He emphasizes the need for careful consideration in its development and deployment.
Resources
External Resources
Books
- "The New Age of Sexism" - Mentioned in relation to technology and discrimination against women.
Articles & Papers
- "AI Super PAC Leading the Future Say They Are Spinning Up a Multi Million Dollar Effort to Sync Bores's Nascent Primary Campaign to Replace Retiring Manhattan Representative Jerry Nadler" (Political Pro) - Mentioned as the source for information about the AI industry targeting Alex Bores.
People
- Alex Bores - State assembly member and candidate for New York's 12th congressional district, targeted by an AI industry super PAC.
- George Conway - Mentioned as someone who declared candidacy for the 12th district seat.
- Jack Schlossberg - Mentioned as a candidate for the 12th district seat.
- Jerry Nadler - Retiring representative for New York's 12th congressional district.
- Joe Weisenthal - Co-host of the Odd Lots podcast.
- Mark Cuban - Mentioned as a supporter of Alex Bores's campaign.
- Mark Zuckerberg - Mentioned in relation to Meta's AI research.
- Tracy Alloway - Co-host of the Odd Lots podcast.
- Trump - Mentioned in relation to an executive order on AI.
Organizations & Institutions
- Bloomberg - Mentioned as the source for the Odd Lots podcast and newsletter.
- CDC (Centers for Disease Control and Prevention) - Mentioned in relation to Alex Bores's work tracking epidemics.
- Conde Nast - Mentioned in relation to a difficult subscription cancellation experience.
- Department of Justice - Mentioned in relation to Alex Bores's work on the opioid epidemic and violent crimes.
- DeepSeek - Mentioned as a company using knowledge distillation for AI models and as a potential target of US regulation.
- Disney - Mentioned in relation to an equity investment in OpenAI.
- xAI - Mentioned as a frontier AI lab that would be subject to the RAISE Act.
- Federal Civilian - Mentioned in relation to Alex Bores's work at Palantir.
- Google - Mentioned as a frontier AI lab and in relation to its Gemini image generator.
- Hunter College - Mentioned as providing an intern to Alex Bores's office.
- Leading the Future - A $100 million AI-industry super PAC targeting Alex Bores.
- Meta - Mentioned as a frontier AI lab.
- New York State Assembly - Mentioned in relation to Alex Bores's legislative work.
- New York State Senate - Mentioned in relation to a crypto bill.
- OpenAI - Mentioned as a frontier AI lab and in relation to funding for Alex Bores's campaign.
- Palantir - Mentioned as a company Alex Bores previously worked for, focusing on data integration and analysis for government.
- Trump Administration - Mentioned in relation to an executive order on AI.
- US Government - Mentioned in relation to data utilization and policy.
- Veterans Affairs (VA) - Mentioned in relation to Alex Bores's work on hospital staffing.
Tools & Software
- Bloomberg Terminal - Mentioned in relation to government data analysis.
- ChatGPT - Mentioned for AI coding assistance.
- C2PA (Coalition for Content Provenance and Authenticity) - An open metadata standard for verifying the provenance and authenticity of digital content.
- Gemini - Mentioned as an AI image generator.
- HTTP - Mentioned in contrast to HTTPS for secure web connections.
- HTTPS - Mentioned as a secure web connection standard.
- Knowledge Distillation - A technique for training smaller AI models based on larger ones.
- LLM (Large Language Model) - Mentioned in relation to AI capabilities.
- Palantir Ontology - A layer that models data as real-world objects and relationships, rather than raw documents or tables, to make analysis easier.
- Super PAC - Mentioned in relation to political campaign funding.
Websites & Online Resources
- Alex Bores's Campaign Website - Mentioned as a place to follow his campaign.
- Bloomberg.com/oddlots - Mentioned for the Odd Lots newsletter.
- Discord.gg/oddlots - Mentioned for listener chat.
Other Resources
- AI Regulation - Discussed as a major political topic with various approaches.
- Click to Cancel Bill - A New York bill requiring subscriptions to be cancellable the same way they were signed up for.
- Compute - Mentioned as a factor in determining AI model training costs.
- Data Lake - Mentioned as a concept for storing data in one place.
- Data Integration and Analysis - The core function of Palantir.
- Digital Assets - Mentioned in relation to Eric Adams's interest.
- Executive Order on AI - Issued by Trump, potentially preempting state AI regulations.
- Frontier Model - Defined by compute cost and computational operations.
- Gigaflops - A measure of computational performance.
- Great Recession - Mentioned in relation to Alex Bores's work analyzing bank behavior.
- Internet Banking - Mentioned as a past technological concern now commonplace.
- Limited Purpose Trust - A legal structure for crypto companies in New York.
- Mopeds and E-bikes - Discussed in relation to New York registration laws and safety.
- National Rule on AI - Trump's executive order for a single national AI regulation.
- New York Crypto Regulation - Discussed in relation to limited purpose trusts and bit licenses.
- Nuclear Energy/Fission - Used as an analogy for the potential broad impacts of AI.
- Ontology - A view of data meaning used in analysis.
- RAISE Act - The Responsible AI Safety and Education Act, a bill sponsored by Alex Bores to put safety standards on advanced AI research.
- Subscription Cancellation - Discussed in relation to the "Click to Cancel" bill.
- Telemarketers - Mentioned in relation to a bill to increase fines.
- Tokenization - Mentioned in relation to Eric Adams's interest.
- Transformers (BERT and LASER) - AI technologies used in anti-money laundering.
- Universal Pre-K - Mentioned as a government investment with long-term payback.