Tech News
In a landmark ruling on September 5, 2025, the European Commission fined Google €2.95 billion (about $3.5 billion) for abusing its dominant position in the digital advertising technology sector, finding that it unfairly favored its own ad-tech platforms—the AdX ad exchange and the DFP ad server—over those of rivals. The EU demanded that Google cease its self-preferencing practices and submit remedies within 60 days, with a potential requirement to divest parts of its ad-tech business if solutions prove inadequate. The decision marks Google’s fourth major antitrust penalty in Europe and once again elevates scrutiny of its market practices as similar probes unfold in the U.S., the U.K., and Canada.
Sources: The Verge, Wall Street Journal, AP News
Key Takeaways
– Scale of Regulation: The €2.95 billion sanction underscores the EU’s willingness to levy hefty fines to curb Big Tech’s anti-competitive conduct.
– Structural Remedy Pressure: Beyond penalties, the EU is pushing for tangible changes—Google must propose remedies in 60 days, potentially including divestiture.
– Global Antitrust Wave: The case adds to mounting scrutiny on Google’s ad-tech practices globally, with parallel investigations ongoing in the U.S., U.K., and Canada.
In-Depth
On September 5, 2025, the European Commission delivered a clear and firm signal to Big Tech by imposing a €2.95 billion fine on Google, citing abusive conduct tied to its ad-tech platforms. Regulators accused Google of self-preferencing—steering business toward its own ad exchange (AdX) and ad server (DFP) systems, effectively undermining fair competition in the programmatic advertising space. This marks Google’s fourth major sanction from Brussels in under a decade, following previous rulings on shopping, Android bundling, and search ad placement.
Critically, the EU did more than levy a financial penalty. The Commission gave Google a 60-day window to propose structural remedies—possibly including divestiture if its plan fails to address conflicts of interest. This reflects a broader regulatory posture: penalties alone often aren’t enough to change embedded business behavior; regulators want durable fixes. The decision comes amid a wave of antitrust momentum: U.S. authorities are pursuing their own ad-tech case, while regulators in the U.K. and Canada have launched similar probes. Google, for its part, argued that the decision is wrong and promised to appeal, maintaining that its services foster fair competition.
For observers, these dynamics raise salient questions. On one hand, antitrust enforcement plays a crucial role in maintaining market integrity and protecting competition. On the other, the escalating regulatory burden—especially cross-border—can deter American innovation or invite politicization via trade retaliation, as already seen in Washington’s sharp rebuke. The future hinges on balanced enforcement: fair, predictable rules that punish abuses without knee-jerk overreach. For now, the EU has made clear it’s not just watching—it’s actively reshaping digital economics.
Europe has officially launched its first exascale supercomputer, Jupiter, located at the Jülich Supercomputing Centre in Germany. This powerhouse can handle more than one quintillion operations per second, making it not only the fastest supercomputer in Europe but the fourth fastest in the world. Backed by a €500 million investment from both the EU and Germany, Jupiter employs cutting-edge Nvidia Grace Hopper superchips and advanced cooling that even recycles heat for campus use. The machine is already earmarked for climate and weather modeling at unprecedented resolution, multilingual large language model development, breakthroughs in medical research like Alzheimer’s and HIV therapy, and even quantum simulation pushing beyond 50 qubits. Leaders from the European Commission welcome Jupiter as a cornerstone in boosting digital sovereignty and accelerating scientific innovation.
Sources: Reuters, Nvidia Blog, IT Pro
Key Takeaways
– Strategic Sovereignty & Capability Leap: Jupiter marks a significant step for Europe’s technological independence and global competitiveness in supercomputing.
– Wide-Ranging Scientific Impact: From granular climate modeling to AI, medical research, and quantum computing, Jupiter’s applications are broadly transformative.
– Energy Efficiency & Sustainability: Built with eco-savvy features like energy-efficient chips and heat recycling, Jupiter balances power with responsibility.
In-Depth
Europe’s technological horizon just got a massive power boost with the launch of Jupiter, its inaugural exascale supercomputer, now live at the Jülich Supercomputing Centre in Germany.
Backed by a substantial €500 million investment from both the European Union and Germany, Jupiter steps into the global limelight as the fastest supercomputer in Europe and the world’s fourth fastest. It’s built around the advanced Nvidia Grace Hopper superchips, housed in a cutting-edge liquid-cooled architecture that even harvests waste heat to warm campus facilities—a smart nod toward sustainable engineering.
Jupiter is already driving scientific frontiers. It enables researchers to run climate simulations at one-kilometer scale, unlocking unprecedented accuracy in predictions of heatwaves, storms, and floods. Meanwhile, the University of Lisbon is gearing up to train multilingual AI models that can support all European languages—a meaningful stride toward inclusive AI. On the health front, researchers anticipate breakthroughs targeting severe conditions like Alzheimer’s and HIV. And Jupiter’s capabilities even extend to the quantum realm: it’s expected to handle over 50 qubits, outpacing the previous record of 48 and opening new doors in quantum simulation.
This isn’t just about raw computing power—it’s about asserting European digital sovereignty. By stepping up with advanced infrastructure, Europe stakes its claim in the global high-performance computing space, aiming to close the gap with U.S. and Chinese AI and research capabilities. With energy-efficient performance and broad scientific utility, Jupiter is designed not only to power discoveries but to do so responsibly.
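To put “exascale” in context: one quintillion is 10¹⁸ operations per second. A back-of-envelope comparison against an ordinary laptop, assumed here to sustain roughly 10¹¹ floating-point operations per second (actual figures vary widely by machine and workload), illustrates the gap:

```python
# Back-of-envelope: what "exascale" means in practice.
# Assumption: exascale = 1e18 operations per second; a typical laptop
# sustains on the order of 1e11 FLOP/s (100 GFLOP/s) -- an illustrative figure.
JUPITER_OPS = 1e18
LAPTOP_OPS = 1e11

# How long would the laptop take to match one second of Jupiter's work?
seconds = JUPITER_OPS / LAPTOP_OPS   # 1e7 seconds
days = seconds / 86_400              # roughly 116 days
print(f"~{days:.0f} days of laptop time per second of Jupiter time")
```

In other words, one second of Jupiter’s output corresponds to months of continuous computation on consumer hardware, under these illustrative assumptions.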
A recent investigation revealed that the event planning app Partiful failed to strip GPS metadata from user-uploaded photos, meaning that anyone using basic web tools could access precise latitude/longitude data tied to images stored in the app’s Firebase backend. The company acknowledged the flaw, fast-tracked a fix, and reprocessed existing uploads to remove location info. The oversight raised serious concerns—particularly as Partiful has grown into a social graph connecting users’ events, contacts, and locations, and some have already raised alarm over its founders’ prior ties to Palantir.
Sources: TechSpot, TechBuzz
Key Takeaways
– The metadata exposure meant that users’ home, work, or other precise photo-capture locations could have been revealed to anyone inspecting backend image files.
– Partiful responded within days: it stripped metadata from new and existing photos, and publicly disclosed the vulnerability.
– The incident underscores how even startups must treat data hygiene—especially location metadata—as a foundational privacy requirement, not an optional extra.
In-Depth
Partiful is carving out a niche as a hip, minimalist alternative to Facebook Events—a fast, stylish way to plan gatherings and manage RSVPs. But in a recent deep dive, TechCrunch’s security team discovered that Partiful was not automatically stripping geolocation metadata (EXIF GPS tags) from photos that users uploaded, including profile images. These tags—standard in almost every smartphone photo—store precise latitude and longitude coordinates. Because Partiful stored the “raw” images in Google’s Firebase backend, anyone with decent tech savvy could access them through browser developer tools and extract that GPS data.
To validate the issue, TechCrunch uploaded a photo taken outside San Francisco’s Moscone West convention center, which included exact coordinates. When examined on the server side, the photo still carried the same location metadata, confirming that Partiful had not scrubbed it before or during storage. In effect, if someone had snapped a photo at or near their home or workplace and used it as their Partiful profile picture, that location could have been exposed to any user who poked around.
That’s a serious lapse. Most major platforms deploy metadata-stripping by default for privacy reasons; leaving it intact is widely regarded as negligent when you’re storing user photos. Recognizing the gravity, TechCrunch alerted the Partiful team, which acknowledged the issue was “already on our team’s radar” and soon accelerated its fix cycle. Within days, Partiful stripped metadata from new uploads and reprocessed older images to remove sensitive GPS information. The company also publicly disclosed the bug and said it was investigating whether any improper access occurred.
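Partiful’s actual pipeline is not public, but the kind of scrubbing described here is straightforward to sketch. Below is a minimal illustration using the Pillow library; the function names are my own, chosen for clarity.

```python
from PIL import Image

GPS_IFD = 0x8825  # EXIF tag that points at the GPS sub-IFD (latitude/longitude)

def gps_tags(path):
    """Return the GPS sub-IFD from an image's EXIF data (empty dict if none)."""
    exif = Image.open(path).getexif()
    return dict(exif.get_ifd(GPS_IFD))

def strip_metadata(src, dst):
    """Re-save only the pixel data, dropping EXIF (and with it any GPS tags)."""
    img = Image.open(src)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))  # copy pixels only, no metadata
    clean.save(dst)
```

A server-side upload handler would call something like `strip_metadata` before persisting the file, so the stored object never contains location data in the first place, which is the default posture most major platforms take.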
This scandal is more than just a technical slip—it touches on trust, transparency, and the responsibilities that come with collecting user data. It also arrives amid scrutiny over Partiful’s founding team, which includes former Palantir employees. Some critics had already flagged privacy concerns around those ties; this incident amplifies them. As Partiful transitions from a simple event tool toward becoming a social graph (connecting users, tracking interactions, and mapping their events), data safety becomes essential rather than optional.
For users, this is a reminder: disable geographic tagging in your camera app settings, view or strip metadata before posting images, and treat every platform as a potential vector of exposure. For startups—and investors—metadata hygiene should be part of the security baseline, not an afterthought. As Partiful scales, its ability to safeguard sensitive information will be pivotal to whether it becomes a trusted platform or a cautionary tale.
Chinese researchers led by Xingyu Jiang from the Southern University of Science and Technology have unveiled a prototype “DNA cassette tape” system that promises extraordinary storage density (theoretical limits on the order of 80 million DVDs per kilometer) and multi-century durability — but with write and read speeds so slow that filling even a small fraction of its potential would take ages. The system uses a polyester-nylon composite tape patterned with barcode “tracks” (about 5.45×10⁵ addressable partitions per 1,000 meters) for indexing; synthetic DNA strands are deposited in the partitions and then sealed under a protective layer of zeolitic imidazolate framework (ZIF). In proof-of-concept, they stored and recovered a 156.6 KB image (“lantern”) through several operations including deposition, erasure, recovery, and redeposition — but current speed and cost make the system impractical for large-scale use.
Sources: The Register, Science
Key Takeaways
– Massive capacity vs. practical limits: While theoretically this DNA tape could store petabytes per meter (e.g. ~362 PB/km), current measured capacity is far lower, and the throughput (writing/reading) is extremely slow — orders of magnitude below conventional storage media.
– Longevity and stability are strong points: The protective ZIF coating and molecular design suggest retention of data for several hundred years at room temperature; longer in colder conditions. This makes it appealing for cold or archive storage.
– Engineering challenges are real: DNA synthesis and sequencing are costly, and reaction times (encapsulation, decapsulation, deposition, recovery) are slow. Scaling up to fill even a small fraction of the tape’s possible capacity would take impractical amounts of time and resources with today’s tech.
In-Depth
The new DNA cassette tape system represents a striking fusion of old and new: taking the physical format of magnetic tape — known for archival storage — and combining it with the dense information-carrying power of synthetic DNA. Researchers designed a polyester-nylon composite strip, ink-jet printed with barcodes to divide it into hundreds of thousands of partitions per kilometer (about 5.45×10⁵ per 1,000 m), each partition addressable at up to roughly 1,570 partitions per second. Into each hydrophilic partition, synthetic DNA is deposited; a protective zeolitic imidazolate framework (ZIF) is then used to encapsulate it, defending against heat, enzymes, and environmental damage.
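Notably, the barcode indexing itself is not the bottleneck. A back-of-envelope calculation from the figures above shows that sweeping every partition on a kilometer of tape at the stated addressing rate takes only minutes; it is the chemistry (deposition, encapsulation, recovery) that dominates the time budget:

```python
# Back-of-envelope from the reported figures: time to address every
# partition on one kilometer of tape at the stated indexing rate.
partitions_per_km = 5.45e5   # ~5.45 x 10^5 addressable partitions per 1,000 m
address_rate = 1_570         # partitions addressable per second (reported)

seconds = partitions_per_km / address_rate
print(f"~{seconds / 60:.1f} minutes to sweep a full kilometer")
```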
In trials, the team successfully demonstrated data deposition and recovery: specifically, a 156.6 KB image of a lantern, stored, retrieved, erased, and redeposited through multiple cycles. However, the time required is daunting: full-cycle operations (deposit, recover, erase, and so on) took on the order of tens of minutes to hours. Some steps — like three recoveries and one redeposition — took about 150 minutes, and even with optimizations the fastest possible recovery could be around 47 minutes.
Perhaps most striking is the gap between what’s possible in theory and what’s viable now. Theoretical maximum storage density sits near 362 petabytes per kilometer, equivalent to ~80 million DVDs. But current measured capacity per kilometer is much less (on the order of tens of gigabytes). For example, one experiment shows about 74.7 GB per km in its present form.
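A rough calculation using the figures above makes the theory/practice gap concrete. Treating the ~47-minute best-case recovery of the 156.6 KB test image as a sustained read rate (an optimistic assumption, since it ignores addressing and chemistry overhead between partitions), reading even the current measured capacity of one kilometer of tape would take decades:

```python
# Back-of-envelope throughput estimate from the reported figures.
image_bytes = 156.6e3             # 156.6 KB proof-of-concept image
best_recovery_s = 47 * 60         # ~47-minute best-case recovery
rate = image_bytes / best_recovery_s   # sustained bytes/second, ~55.5 B/s

current_capacity = 74.7e9         # ~74.7 GB measured per km of tape
years = current_capacity / rate / (365 * 86_400)
print(f"~{rate:.1f} B/s sustained; ~{years:.0f} years to read one km")
```

Even under this generous assumption the read-out of a single kilometer takes on the order of forty years, which is why the authors position the medium for rarely-accessed archives rather than active storage.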
That said, the durability prospects are strong: modeling suggests retention of data around 345 years at room temperature, with extended longevity under cooler conditions. That makes the technology especially appealing for cold archive storage (data that’s written rarely and seldom read).
From a conservative, realistic standpoint, the DNA cassette tape is not something you’d deploy today for your main file server or frequent backups. Costs, speed, and scalability are all roadblocks. But in a world where data growth (in science, media, government, history) is exploding, this kind of archival medium may become indispensable. If DNA writing/synthesis and sequencing technologies improve (faster, cheaper), then what seems nearly fantastical now could become a backbone of long-term data preservation.
