When precision fails and AI wins (Issue 55)

Welcome to Issue 55 of the All Turtles newsletter. Every other week, we bring you carefully chosen news and analysis about AI, startups, and updates from our product teams. If you like this newsletter (we hope you do!), please subscribe or share with a friend.

Dr. Healthbot

We’re going to start seeing significantly more healthbots on the front lines of medical triage, according to a report from Juniper Research. Fully handing off medical care to a bot still seems unwise given the current state of chatbot development, but there’s plenty of room for bots to work in tandem with human medical professionals and ease their massive workloads. Healthbots can correlate combinations of symptoms and cross-reference vast amounts of data. New healthbots may be just what the doctor ordered.

Read Healthbots: the new caregivers (All Turtles)

Action items

We know bias in AI is a significant problem, but what are the solutions? This was the central question at a panel at CES that featured Bärí Williams, All Turtles’ VP of Legal, Policy, and Business Affairs. She and her fellow panelists discussed the role that government should play in AI regulation and the level of transparency that tech companies should adopt when working to eliminate bias. Any solution to this problem should incorporate the very technology that perpetuates it, because AI can also be a tool to detect and mitigate bias in humans.

Read What is the government’s role in regulating AI? (All Turtles)

Knowledge is power—and great responsibility

Wikipedia is a contributor-driven website like Twitter or Facebook, so why doesn’t it suffer from the same levels of misinformation and fake news? The Wikimedia Foundation’s executive director Katherine Maher joins this episode to talk about the importance of transparency and of sticking to your values at scale. She also shares how machine learning will become increasingly important to Wikipedia’s operations, and what organizations using Wikipedia’s datasets should know about the flaws in its information.

Listen Episode 44: The Wikimedia Foundation’s Katherine Maher (All Turtles)

Lessons learned

For many startup founders, launching a company can feel like sailing into uncharted waters, but plenty of small startups are doing things well and can serve as lighthouses on the stormy seas. Joy, for instance, is a financial decision-making app that has had success hiring a non-traditional team, and Moven is a banking app that has changed customer behavior for the better. To learn more, read Fluxx’s compilation of notable startup achievements and what other founders can learn from them.

Read 31 lessons from startups doing it better than you (Medium)

AI working smarter, not harder

Researchers at Stanford and Google built a machine learning algorithm to translate aerial images into street maps, but what they created proved far more interesting than any set of directions: an AI that figured out how to cheat. To complete its assigned task, the AI hid information it would need later, effectively circumventing a more complex process. It’s a reminder that algorithms will do exactly what you tell them to do, and ascribing the human narrative of “cheating” obscures the essential truth: developers and researchers need to be mindful of absolute precision when programming.

Read This clever AI hid data from its creators to cheat at its appointed task (All Turtles)

Jobs 

Please apply, or if you know anyone who may be interested, forward this along to them.

That’s all for now. If you have suggestions, comments, or just want to say hi, send an email to hello@all-turtles.com—we read every message.