The Impact
A weekly look at the intersection of AI, advocacy, and politics from the team at MFStrategies | www.MFStrategies.com
Are you building AI for the future of campaigns or advocacy? We want to include you in our database of companies providing AI services! Our goal is to help political and policy professionals understand what is out there beyond ChatGPT and what the benefits could be for their organizations. We're in the early stages of collecting information about different tools, so if you want to be included, just fill out our Vendor Fact Sheet and we'll be in touch. The Impact is read by thousands of political and policy professionals looking for the best tech to help them move faster; let us help you get in front of them!
Vendor Scorecards: coming soon
AI in government is speeding up. GSA opened Meta’s Llama to every agency through OneGov, skipping one-off deals but raising tough questions on security, ATOs, and model risk. Tribal nations are setting their own AI rules around data and culture, and vendors will have to adapt. And even as “agentic” tools promise faster services, leaders want audits, transparency, and humans in the loop. The trend is clear: move fast, but ship strong guardrails.
AI / Political News of the Week
Meta Newsroom
Takeaway: GSA added Meta’s Llama open-source AI models to its OneGov initiative, giving all federal agencies streamlined access to generative AI. The move skips one-off negotiations and leans on GSA’s backend vetting so teams can build and test with more control over data and lower costs. The effort ties to America’s AI Action Plan and OMB memos M-25-21 and M-25-22 to speed accountable AI use in government.
Why it matters: Government-wide access to an open model could speed pilots and reduce lock-in to closed vendors, but execution details still matter: security reviews, ATO paths, model risk management, and long-term support. Skipping one-off negotiations sounds fast, yet agencies will still need plans for hosting, fine-tuning, monitoring, and accountability, either in-house or through integrators. Federal buyers should ask for clear guardrails, performance baselines, and transparency before scaling beyond tests.
ASU News
Takeaway: ASU Law and the Center for Tribal Digital Sovereignty are hosting a Sept. 26 event on “AI in Indian Country,” setting up the spring Wiring the Rez conference. Tribal leaders, lawyers, and technologists will discuss AI through the lens of sovereignty, data rights, and culture. Some nations, like the Cherokee Nation, are already adopting AI policies with governance committees and cultural protections.
Why it matters: Sovereign tribal governments are writing their own AI rules, not just following federal or state policy. Decisions on data centers, broadband, and data ownership will shape land, water, culture, and local jobs; vendors and agencies will have to meet rules that differ by nation. University and finance partners can build capacity, but they can also steer agendas, so leaders should expect hard questions about who benefits and who controls the data.
Federal News Network
Takeaway: Agentic AI (tools that can analyze, decide, and take action) could speed up how agencies track spending, flag problems, and deliver services. The piece argues for clear guardrails: transparency, regular audits, and human-in-the-loop checks. It also urges training public servants in ethics, systems thinking, and judgment, not just tech skills.
Why it matters: Agencies are getting a flood of “AI agent” pitches; concrete steps like audits, transparency, and human oversight help leaders spot real value and manage risk. The author leads a major federal contractor, so expect these priorities to show up in proposals and policy talks, but they’re not mandates yet. Guardrails now can prevent repeats of past model failures while still backing high-impact use cases like proactive benefits, job training matches, and outbreak detection.
Worth thinking about
“No algorithm can decide what kind of society we want to be. That’s still our job.” — Paul Decker