The Trump administration’s push to transfer TikTok’s U.S. operations into American hands includes plans to retrain its recommendation algorithm on U.S. user data and place control with U.S. investors. But experts caution that simply retraining the model won’t erase the embedded architecture, biases, and content-delivery style stemming from the app’s original design. They argue that algorithmic habits and systemic tendencies may persist unless the system is rebuilt from scratch. Meanwhile, the proposed deal would give Oracle and a U.S. joint venture oversight of data, security, code, and moderation, with Americans filling most seats on a new board.
Sources: The Epoch Times, Forbes
Key Takeaways
– Algorithm retraining alone might not eliminate the legacy design choices and biases implicit in TikTok’s system; a deeper rebuild could be necessary to fully reset its behavior.
– The proposed structure places algorithm, data, and moderation under U.S. entities, with the goal of reducing Beijing’s influence, but details of how control is asserted remain vague.
– Observers worry that embedding U.S. governmental or investor influence in algorithmic governance could shift content priorities or open the door to new forms of manipulation.
In-Depth
In recent weeks, the conversation over TikTok’s future in the U.S. has shifted from “ban or keep” to “how do we own and control the algorithm?” Under the latest framework the Trump administration is promoting, TikTok’s U.S. business would become a new, largely American-owned joint venture. The algorithm would be retrained using U.S. user data, housed under U.S. infrastructure (notably with Oracle managing cloud and data security), and placed under U.S. board control — ensuring the parent Chinese firm ByteDance no longer exerts direct algorithmic influence.
But this shift raises more questions than it answers. Optimists emphasize that such a restructuring addresses America’s national security concerns: data isolation, foreign interference, and algorithmic leverage. The structural changes would insulate U.S. accounts from global data flows and give American oversight of how content is prioritized and moderated. The White House fact sheet states that “the divestiture puts the operation of the algorithm, code, and content moderation decisions under the control of the new joint venture.”
Yet experts warn that retraining an existing algorithm does not necessarily reset its inherent tendencies. Algorithms are shaped not just by training data but by architecture, feature weighting, objective functions, network effects, and design choices that reflect assumptions embedded early on. An algorithm optimized under one regime may retain implicit preferences, pathways, or “momentum” that do not disappear with a change in ownership. Analysts suggest a full rebuild might be necessary to avoid carryover biases or behavior patterns.
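The distinction between retraining and rebuilding can be made concrete with a toy sketch. The example below is purely illustrative and hypothetical — it is not TikTok’s actual system, and every name in it (the `GAMMA` exponent, the `score` and `retrain` functions) is an invented stand-in. It shows how “retraining” typically refits only the learned parameters, while design choices hard-coded into the scoring function survive every retraining run, regardless of whose data is used.

```python
# Hypothetical toy ranker, for illustration only. The "architecture"
# hard-codes an engagement-amplifying exponent GAMMA; "retraining"
# refits only the learned trade-off weight w.

GAMMA = 2.0  # fixed design choice: super-linear boost for viral items


def score(item, w):
    # The learned weight w trades off freshness against engagement,
    # but the GAMMA exponent is baked into the scoring function itself.
    return w * item["recency"] + (1 - w) * item["engagement"] ** GAMMA


def retrain(clicks, items):
    # Grid-search w to best fit observed click signals.
    # Only w changes -- GAMMA survives every retraining run.
    best_w, best_err = 0.0, float("inf")
    for step in range(101):
        w = step / 100
        err = sum((score(it, w) - c) ** 2 for it, c in zip(items, clicks))
        if err < best_err:
            best_w, best_err = w, err
    return best_w


# Hypothetical "U.S. user data":
us_items = [{"recency": 0.9, "engagement": 0.2},
            {"recency": 0.1, "engagement": 0.8}]
us_clicks = [0.3, 0.6]

w_new = retrain(us_clicks, us_items)
# New data yields a new w, but the viral-amplifying GAMMA term is
# exactly what the original designers chose.
```

In this sketch, swapping in a different training set changes `w` but never `GAMMA` — an analogue of the experts’ point that ownership of the data pipeline does not, by itself, reset choices embedded in the system’s design.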
Moreover, giving a government-backed or U.S. investor-dominated structure authority over algorithmic decisions could introduce its own biases. With decisions about which content surfaces and how users are nudged, even a “neutral” U.S. system would embed values and priorities. Critics caution that assuming a purely benign oversight role risks underestimating how algorithmic choices shape discourse and incentives.
Finally, despite the sweep of this reorganization, significant uncertainty remains. The public still does not know whether the new algorithm will closely mimic the old one or be fundamentally redesigned, how board oversight will translate into day-to-day algorithmic control, or how transparent the system will be to outside audit or appeal. As negotiations with China and internal structuring continue, the core challenge will be whether the new version of TikTok can truly disentangle itself from its origins while remaining the vibrant, effective content engine that users expect.

