Compliance as an Enabler, Not a Blocker

Dr. Marshall’s thought leadership centers on the practical realities of governing advanced technologies at scale, particularly where AI innovation, regulatory expectations, and business outcomes intersect. Her work emphasizes that policy is not a constraint on AI success but a foundational pillar of sustainable, trustworthy, and profitable AI deployment.

She is the author of The AI Profitability Gap, which examines why the majority of AI initiatives fail to deliver return on investment and argues that governance, policy alignment, and regulatory readiness are critical enablers of long-term value. The book reframes AI profitability as an execution challenge, one that requires integrating policy, risk management, and compliance directly into how AI systems are designed, deployed, and scaled.

Beyond authorship, Dr. Marshall actively engages with think tanks and U.S. policymakers, including Congressional stakeholders, to help shape emerging AI governance and regulatory approaches. Her contributions focus on bridging the gap between legislative intent and operational feasibility, ensuring that AI policy supports innovation while establishing clear, enforceable accountability mechanisms.

Dr. Marshall also serves as a lecturer and educator on operationalizing technology regulation, translating complex regulatory frameworks into practical guidance for engineers, compliance professionals, and organizational leaders. Her teaching emphasizes how standards and regulations—such as emerging AI laws and risk management frameworks—can be implemented through training, controls, validation, evidence, and reporting practices that scale in fast-paced, highly regulated environments.

Across these efforts, her thought leadership is defined by applied governance: moving beyond theory to show how policy, compliance, and integrity can be operationalized in real systems to support responsible innovation, regulatory trust, and measurable business outcomes.