
PowerShell 2.0 Retirement Ends Windows 7 Era as Microsoft Forces Enterprise Migration
Microsoft officially deprecated PowerShell 2.0 this week, ending support for the framework that launched alongside Windows 7 in 2009. The retirement reflects broader patterns visible across today’s tech landscape: Amazon builds custom AI superclusters, European publishers battle Google’s content scraping, and Wikidata tests blockchain-style governance for its knowledge base.
Today’s Tech Roundup
Microsoft Retires PowerShell 2.0 After 16 Years, Forcing Enterprise Migration
Microsoft officially deprecated PowerShell 2.0 this week, ending support for the framework that launched alongside Windows 7 in 2009. The retirement follows the discovery of CVE-2025-3120 (pending validation in official databases), a remote code execution vulnerability that will not receive patches. Enterprise administrators must migrate to PowerShell 7.4 before security compliance audits flag their systems.
Documentation from the era shows PowerShell 2.0 promised to revolutionise Windows administration through object-oriented scripting. Technical specifications confirm it lacked modern security features such as Just Enough Administration (JEA), leaving it vulnerable to privilege escalation attacks. The migration delay reflects enterprise reluctance to abandon working automation scripts.
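For administrators starting that migration, a minimal audit sketch (assuming Windows 8/Server 2012 or later, where the v2 engine ships as an optional feature) is to find and remove the legacy engine so scripts cannot silently downgrade to it via `powershell.exe -Version 2`:

```powershell
# List any installed PowerShell 2.0 engine components and their state
Get-WindowsOptionalFeature -Online -FeatureName MicrosoftWindowsPowerShellV2* |
    Select-Object FeatureName, State

# Remove the v2 engine so nothing can invoke it (requires an elevated session)
Disable-WindowsOptionalFeature -Online -FeatureName MicrosoftWindowsPowerShellV2Root

# Confirm the engine version the current session is actually running
$PSVersionTable.PSVersion
```

Test against any automation that may still pin the v2 engine before disabling it fleet-wide.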
Archive evidence suggests this mirrors the painful transition from DOS batch files to early Windows scripting. System administrators who manually configured networks through command-line tools will recognise the familiar cycle of learning new frameworks while maintaining critical systems.
Amazon’s Project Rainier: Custom AI Supercluster for Anthropic
Amazon’s Project Rainier delivers 20 exaFLOPS of performance through 50,000 custom Trainium chips and 100,000 Inferentia accelerators. The $1.2 billion supercluster uses single-phase immersion cooling (reportedly reducing infrastructure energy losses by approximately 15%) while supporting Claude 4 training. Node-to-node throughput reaches 2.4 Tb/s over a 3D mesh interconnect.
The system draws a peak of 85 MW (equivalent to roughly 68,000 homes drawing 1.25 kW simultaneously), highlighting AI’s infrastructure demands. Anthropic confirmed Rainier reduces model iteration time from weeks to days, significantly accelerating development cycles. Liquid immersion cooling represents a departure from traditional air-cooled server farms.
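The household comparison is simple division of the peak draw by an assumed 1.25 kW per home:

```powershell
# 85 MW peak demand divided by 1.25 kW per household (both in watts)
$homes = 85e6 / 1.25e3
$homes   # 68000
```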
Contemporary accounts from the 1990s show similar excitement when overclockers pushed 486 processors beyond Intel’s advertised speeds. The scale differs dramatically, but the fundamental drive to extract maximum performance from silicon remains constant across computing generations.
Wikidata’s FOSS Democracy Experiment
Wikidata launched structured direct democracy for its 100 million data entries using Hedera Hashgraph distributed-ledger technology. Users vote on contentious edits through token-based polling, with AI auditing systems detecting manipulation attempts. Early tests show 72% faster dispute resolution than Wikipedia’s manual moderation processes.
The system requires users to earn tokens through contributions, with token allocation tied to editing milestones. Asynchronous Byzantine Fault Tolerance (a consensus mechanism preventing malicious actors from corrupting the voting process) ensures tamper-proof transparency while maintaining community governance principles. EU Digital Services Act compliance mandates algorithmic fairness in voting mechanisms.
Community archives from early Usenet groups demonstrate similar collaborative governance experiments. The token allocation debate echoes historical tensions between technical contributors and casual users in open-source projects.
European Publishers Revolt Against Google’s AI Overviews
Thirty European publishers filed EU antitrust complaints against Google’s AI Overviews, claiming the system scrapes content without consent while making participation the price of search visibility. Axel Springer reports 30% traffic drops where AI answers replace article clicks. Publishers describe a digital hostage scenario: permit AI scraping or lose search ranking.
Google’s opt-out mechanism uses noindex meta tags that disable both AI scraping and search indexing entirely. The EU is fast-tracking Digital Markets Act violation proceedings, which could bring fines of up to 10% of Google’s global annual revenue. Publishers argue the practice violates DMA Article 6 on self-preferencing by making content access effectively compulsory.
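The robots directive at the centre of the dispute is the standard page-level noindex rule, a single meta tag in HTML; the comments note the trade-off the complaint describes:

```html
<!-- Standard robots meta tag: tells compliant crawlers not to index this page. -->
<!-- Per the publishers' complaint, Google offers no separate AI-only opt-out, -->
<!-- so this removes the page from both AI Overviews and ordinary search results. -->
<meta name="robots" content="noindex">
```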
Industry analysis from the 2000s shows similar battles over RSS feed scraping and content aggregation. The power imbalance between content creators and platform gatekeepers represents a recurring theme in digital publishing evolution.
From the Wayback Machine
On This Day: 1956 – MIT’s Whirlwind Computer Accepts First Keyboard Input
MIT’s Whirlwind computer demonstrated direct keyboard input on June 4, 1956, marking the birth of interactive computing. The modified IBM typewriter interface allowed real-time human-computer communication, replacing punch cards and batch processing. This breakthrough enabled the SAGE air defence system and established the foundation for modern user interfaces. Every smartphone tap and keyboard stroke traces its lineage to this pivotal moment when computers began responding to human commands instantly.
What This Means
Legacy systems create security vulnerabilities that force painful migrations, while AI infrastructure demands reshape cloud computing economics. Democratic governance experiments test whether collaborative ideals survive commercial pressures. Platform monopolies continue extracting value from content creators through technical leverage. These patterns repeat across computing history, from mainframe transitions to mobile platform wars.
The best tech decisions balance innovation with stability – choose tools that serve your needs rather than vendor roadmaps.