Sönke Ahrens' How to Take Smart Notes: Modern Systematization of the Zettelkasten Method

Post Title Image (Hand-drawn by Ernest Chiang. You might also be interested in his Ernest PKM workflow.)


1️⃣ Introduction: Making Zettelkasten Learnable and Replicable

In 2017, German scholar Sönke Ahrens published a book that transformed the knowledge management field: How to Take Smart Notes. 1

This book accomplished something important: systematizing, proceduralizing, and making actionable Niklas Luhmann's Zettelkasten method.

Before Ahrens, Zettelkasten was more like a “legend”—we knew Luhmann wrote 70 books using this method, but it wasn’t quite clear how ordinary people could replicate this system. Luhmann’s own 1981 paper “Communicating with Slip Boxes” was more philosophical reflection than operational manual. (I personally prefer reflection, but many friends have been asking about methods, so I compiled this note.)

“Writing is not what happens after thinking.
Writing is the medium of thinking.”

— Sönke Ahrens, How to Take Smart Notes (2017)

Read More

Niklas Luhmann's Original Zettelkasten: Two Slip Boxes, Fixed Numbering, and Communication Partner

Zettelkasten slip box note-taking system (Hand-drawn by Ernest Chiang. You might also be interested in his Ernest PKM workflow.)

1️⃣ Introduction: A Sociologist and His Thinking Machine

Niklas Luhmann (1927–1998) was a German sociologist renowned for his systems theory. During his academic career, he achieved astonishing productivity: 70 books and over 400 scholarly articles.

But even more remarkable, he attributed all of this to what seemed like a simple tool: the Zettelkasten (slip box).

  • This was not an ordinary note-taking system.
    • Luhmann began building this system in the 1950s, eventually accumulating over 90,000 index cards.
  • He called this system his communication partner
  • An external brain capable of dialoguing with him, facilitating thinking, and even “surprising” him. 1
    • Doesn’t this sound like a manual, century-old version of an AI Agent or knowledge assistant?!

In his famous 1981 paper “Communicating with Slip Boxes” (Kommunikation mit Zettelkästen), Luhmann described in detail how this system worked. But interestingly, much of the modern understanding of the Zettelkasten method is actually mixed with interpretations and adaptations by later scholars.

This note explores the original method that Luhmann himself actually used.


“Without writing, one cannot think;
at least not in a sophisticated, connectable manner.”

— Niklas Luhmann, Kommunikation mit Zettelkästen (1981) 1

Read More

KKR and ECP's $50B AI Infrastructure Play

Post Title Image (Illustration: KKR HQ is located at 30 Hudson Yards in New York. Image source: Photo by Illya Goloborodko.)

✳️ tl;dr

  • News tracking 2024-10-30 1 → 2025-07-30 2
  • KKR is no stranger to the tech industry, having acquired an 80% stake in Philips’ semiconductor division in 2006, which was later renamed NXP Semiconductors. This was a client Ernest once served, leaving a particularly deep impression.

  • 【2024-10-30】KKR and Energy Capital Partners announced a $50 billion strategic partnership focused on accelerating data center, power generation, and transmission infrastructure development to support global AI and cloud computing expansion 1
  • The partnership combines over 8GW of existing data center pipeline and 100GW of operating and development-ready power generation capacity, with KKR owning over 100 data center facilities worldwide 1
  • ECP invests in clean energy asset base, owning and operating over 83GW of power generation capacity in the U.S. market, spanning five asset classes including power generation, renewables, and storage 1
  • The partnership aims to collaborate with utilities, power producers, and data center developers to rapidly and responsibly develop large data center campuses for hyperscalers 1
  • Context: BlackRock launched a $30 billion AI infrastructure fund in the same month, backed by Microsoft and Nvidia, demonstrating capital’s rush into AI infrastructure 1

  • 【2025-07-30】First investment lands: 190MW hyperscale data center campus in Bosque County, Texas, marking 9 months from strategic announcement to first project announcement 2
  • Innovative co-location model: Data center adjacent to Calpine’s Thad Hill Energy Center natural gas power plant, representing the first such dedicated power agreement with a hyperscaler 2
  • Constructed through a joint venture between CyrusOne and ECP, expected to be operational in Q4 2026, with total investment approaching $4 billion, initial IT capacity of 144MW, spanning over 700,000 square feet 2
  • Behind-the-Meter model: Calpine provides 190MW of dedicated power, which can be redirected to support system reliability and local demand during ERCOT grid emergencies 2

  • AI-driven power demand: Goldman Sachs predicts global data center power demand will grow 165% by 2030 compared to 2023, with AI data center power density surging from the traditional 5-10kW/rack to 50-200kW/rack 3 4
  • Grid bottleneck severity: New data center grid connection delays have reached 5 years, while Behind-the-Meter natural gas generation can be deployed within 18-24 months, becoming a pragmatic choice 5
  • Texas regulatory environment: SB6 new legislation (signed June 2025) requires large loads over 75MW to bear grid costs, accept emergency curtailment, and install ERCOT-controlled “kill switches” 6 7
  • Private equity continues to double down: 2024 data center investment reached $108 billion (triple that of 2023), with KKR’s 2021 acquisition of CyrusOne for $15 billion laying the foundation for this partnership 8 9
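The rack-density jump quoted above changes facility math dramatically. A minimal back-of-envelope sketch, assuming the mid-points of the 5-10 kW and 50-200 kW ranges (the mid-point choice is mine, purely for illustration):

```python
# Back-of-envelope: how many racks one megawatt of IT power can feed
# at legacy vs. AI-era rack densities (mid-points assumed from the ranges above).
LEGACY_KW_PER_RACK = 7.5   # mid-point of 5-10 kW/rack
AI_KW_PER_RACK = 125.0     # mid-point of 50-200 kW/rack

def racks_per_megawatt(kw_per_rack: float) -> float:
    """Number of racks a single MW of IT power supports at a given density."""
    return 1000.0 / kw_per_rack

legacy_racks = racks_per_megawatt(LEGACY_KW_PER_RACK)  # ~133 racks/MW
ai_racks = racks_per_megawatt(AI_KW_PER_RACK)          # 8 racks/MW
print(f"legacy: {legacy_racks:.0f} racks/MW, AI: {ai_racks:.0f} racks/MW")
```

Same megawatt, roughly 16x fewer racks: power, not floor space, becomes the binding constraint.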

Read More

BlackRock-Led Consortium Acquires Aligned Data Centers for $40 Billion in Record AI Infrastructure Deal

Post Title Image (Illustration: Aligned Data Centers. Image source: Aligned Data Centers.)

✳️ tl;dr

  • AIP, MGX, and GIP formed a consortium to acquire Aligned Data Centers for $40 billion 12
  • This marks AIP’s (AI Infrastructure Partnership) first investment since its establishment

  • AIP was founded by BlackRock, Microsoft, NVIDIA, and MGX in September 2024 3
  • AIP targets mobilizing $30 billion in equity capital, with potential to reach $100 billion including debt financing
  • Aligned owns 50 campuses with over 5 GW of operational and planned capacity across key digital hubs in the US and Latin America

  • The hyperscale data center market is projected to reach $167.3 billion in 2025
  • Growing at a 23.58% CAGR to $602.4 billion by 2030 4
  • Global data center electricity consumption is expected to double from 415 TWh in 2024 to 945 TWh in 2030, accounting for 3% of global electricity usage 5
  • Aligned holds over 50 patented cooling technologies, including air, liquid, and hybrid cooling systems designed specifically for high-density AI workloads 1 6
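The 415 TWh to 945 TWh projection above implies a steep but computable growth rate. A quick sketch, assuming a six-year 2024-to-2030 horizon (the horizon choice is an assumption of this sketch):

```python
# Implied compound annual growth rate for the 415 -> 945 TWh projection above,
# assuming a six-year 2024 -> 2030 horizon.
START_TWH, END_TWH, YEARS = 415.0, 945.0, 6

cagr = (END_TWH / START_TWH) ** (1 / YEARS) - 1
print(f"implied CAGR: {cagr:.1%}")  # roughly 15% per year
```

More than doubling in six years works out to roughly 15% annual growth, which is why grid planners treat this as a structural shift rather than a cycle.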

  • NVIDIA’s latest GB200 chip requires power density up to 120 kilowatts per rack, making liquid cooling essential for racks above 20 kilowatts 7 8
  • Kuwait Investment Authority and Singapore’s Temasek serve as anchor investors in AIP, demonstrating sovereign wealth funds’ long-term commitment to AI infrastructure 2 3
  • Notably, Macquarie Asset Management first invested in Aligned in 2018, expanding it from 2 facilities to 50 campuses over 7 years, achieving an exit valuation of approximately $40 billion 9 10

  • Major tech companies are expected to invest $400 billion in AI infrastructure in 2025, with OpenAI’s Stargate initiative alone reaching $500 billion 2 11
  • US data center power demand is projected to double from 35 GW in 2024 to 78 GW by 2035, with average hourly electricity consumption tripling 12

  • For you as a manager: Macquarie entered investments (in multiple data center companies) in 2018 and exited 7 years later. As we approach 2026, have you activated your radar to identify targets for the next 3-5 years?

Read More

Off-Balance Sheet AI: How SPVs Are Financing the Data Center Boom While Hiding Leverage

Post Title Image (Photo by Ray Hennessy on Unsplash)

✳️ tl;dr

  • Meta completed nearly $30 billion in financing through an SPV structure
  • Building the Hyperion data center in Louisiana, setting a record for the largest private capital transaction in history 1
  • The Hyperion data center covers 4 million square feet; when fully operational, it will consume 5 gigawatts of electricity, equivalent to the power consumption of 4 million American households

  • Meta retains only 20% equity yet maintains full operational control; this “control without consolidation” accounting technique keeps $27 billion in debt off the balance sheet 2
  • Equity accounts for only 8.5% of total financing ($2.5 billion/$29.5 billion)
  • Insurance companies invest heavily in such projects through private credit, but face asset-liability mismatch risks and may be forced to liquidate investments during economic downturns 3
  • Historical lesson: In the 1990s, telecom companies laid 80 million miles of fiber optic cable; four years after the bubble burst, 85%-95% remained unused, earning the nickname “dark fiber” 4
  • Meta, Amazon, Google, and Microsoft committed to a record $320 billion in capital expenditure this year, mostly for AI infrastructure, yet Meta’s 10-K admits: “there can be no assurance that the usage of AI will enhance our products or services” 5
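The capital structure quoted above ($2.5 billion of equity on $29.5 billion total financing) can be sanity-checked in a few lines; treating the remainder as debt is an assumption of this sketch:

```python
# Capital-structure sketch for the Hyperion SPV figures quoted above:
# $2.5B equity on $29.5B total financing; the remainder is assumed to be debt.
EQUITY_B = 2.5
TOTAL_B = 29.5

debt_b = TOTAL_B - EQUITY_B          # 27.0 (off-balance-sheet under the SPV)
equity_share = EQUITY_B / TOTAL_B    # ~8.5% of total financing
leverage = debt_b / EQUITY_B         # ~10.8x debt-to-equity
print(f"equity share: {equity_share:.1%}, leverage: {leverage:.1f}x")
```

A roughly 10.8x debt-to-equity ratio is the kind of leverage that would stand out immediately on a consolidated balance sheet, which is precisely what the SPV structure avoids.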

  • Power infrastructure becomes one of the bottlenecks
  • Grid Strategies estimates data centers will need an additional 60 gigawatts of electricity by 2030, equivalent to Italy’s national peak demand 6
  • Cooling technology is also crucial: from air cooling to direct liquid cooling to immersion cooling, affecting long-term operating costs 7

  • Morgan Stanley serves as the exclusive underwriter, while also providing financing advisory for multiple similar projects 1
  • Bonds are issued in 144A format private placement, with a spread of 225 basis points above Treasury bonds 1

  • What can technology managers do? (1) Quantify the actual ROI timeline for AI investments (2) Assess power supply chain risks (3) Maintain appropriate financial leverage ratios


Read More

The Geography of AI: Anthropic's Economic Index Tracks AI's Real-World Impact Across 150 Countries

Post Title Image (Photo by The New York Public Library on Unsplash)

✳️ tl;dr

  • On 2025-09-15, Anthropic released its third Economic Index (approaching from different dimensions), tracking Claude usage patterns across 150+ countries and all US states for the first time. 1
  • (Possibly the first comprehensive geographic distribution data of AI adoption in the model industry?!)
  • Enterprise API customers show an automation rate of 77%, significantly higher than consumer users’ 50%, indicating that enterprises are actively shifting AI from collaborative tools to productivity replacement solutions.
  • Directive automation jumped from 27% to 39% within 8 months, marking the first time automation (49.1%) surpassed augmentation (47%), reflecting growing user confidence driven by improved model capabilities.

  • API usage shows low price sensitivity (each 1% increase in the cost index reduces usage by only 0.29%), with enterprises prioritizing capability and value over cost.
  • The speculated reason is that hidden infrastructure costs far exceed model fees (every $1 in model fees requires an additional $5-10 to deploy and reach production-ready status).
  • Ernest’s field observations align with this: those who complain about token costs typically lack sound organizational operational systems or workflows. Conversely, those who see the overall value created are bold in adopting AI.
  • Approximately 5% of API traffic is dedicated to developing and evaluating AI systems, forming a recursive improvement loop of “AI developing AI,” which is speculated to accelerate capability advancement but also require stronger safety oversight.
  • US interstate GDP elasticity (1.8) is significantly higher than cross-country (0.7), yet income has lower explanatory power, indicating that industry composition and economic structure are stronger adoption drivers.
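The 0.29% figure above is a constant (log-log) elasticity, so it compounds rather than scaling linearly. A small sketch of what it implies for a larger price move (the -0.29 value comes from the bullets above; the 10% price change is an arbitrary example):

```python
# What a constant usage elasticity of -0.29 implies for a larger price move.
# Under a log-log model, usage scales as price ** elasticity.
ELASTICITY = -0.29

def usage_change(price_change_pct: float) -> float:
    """Percent change in usage for a given percent change in price."""
    return ((1 + price_change_pct / 100) ** ELASTICITY - 1) * 100

print(f"{usage_change(10):.1f}% usage change for a +10% price change")
```

Even a 10% price hike would trim usage by under 3% under this model, consistent with the report's claim that enterprises are largely price-insensitive.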

  • AUI = Anthropic AI Usage Index
  • Washington DC has the highest AUI (3.82), primarily for document editing and information search; California (third) focuses on programming; New York (fourth) prefers financial tasks, with local economic structures directly mapping to AI usage patterns.
  • Educational instruction tasks grew 40% (9% → 13%), scientific research grew 33% (6% → 8%), showing rapid adoption in knowledge-intensive fields, suggesting that high-skilled workers are leveraging AI to enhance professional capabilities.
  • Business management tasks declined 40% (5% → 3%), financial operations tasks halved (6% → 3%), suggesting these fields may be undergoing automation or users are shifting to more specialized tools.
  • Wealthy countries tend to use AI for augmentation, while poorer countries prefer automation, with each 1% increase in population-adjusted usage corresponding to approximately 3% reduction in automation after controlling for task mix.

  • The research uses privacy-preserving classification methods combining the O*NET database (19,498 task descriptions) and Claude’s proprietary classification system for dual verification, ensuring data anonymization.
  • However, its static nature and coarse-grained classification may fail to capture emerging tasks created by AI and programming work of varying complexity. 2 3
  • The true cost of enterprise AI includes data engineering, security compliance, continuous monitoring, and integration architecture, far exceeding surface-level API fees, which is speculated to explain why enterprises are price-insensitive. 4 5
  • Model capability improvements (Sonnet 3.6 → 4.x series) directly drive behavioral changes, with better output quality reducing iteration needs, suggesting that future more powerful models may further increase automation ratios and transform human-AI collaboration patterns.


Read More

Amazon Bedrock AgentCore Goes GA: Enterprise-Grade Infrastructure for Production AI Agents

Post Title Image (Caption: Early morning at a coffee shop—croissant, coffee, and conversations about AI Agents and life. Taken at Anchorhead Coffee, Seattle. Image source: Ernest.)

✳️ tl;dr

  • Back in late August, I traveled to North America thinking I’d catch the tail end of summer. Instead, everywhere I went—Bay Area, Seattle—everyone was talking about AI Agents and Agentic Workflows.
  • I was fortunate to get hands-on with Amazon Bedrock AgentCore after the AWS New York Summit.
  • Lucky for me, I had presented on Firecracker microVMs at COSCUP a few years back, so I already knew how powerful and secure this isolation technology is.
  • Back then, I spun up an i3.metal EC2 bare metal instance and managed to launch 4,000 microVMs with Firecracker in under 90 seconds 1, visualizing the entire boot process. Startup speed? Not a concern. (Though it depends on your use case—but I’d argue we don’t need to remind AI Agents that “haste makes waste” :p)
  • Even Cloudflare Containers borrowed Firecracker’s open-source project to power their services 2.

  • Today (2025-10-13), AWS officially launched Amazon Bedrock AgentCore—an enterprise-grade agentic platform designed to help organizations move AI agents from prototype to production 3.
  • The AgentCore SDK has been downloaded over 1 million times, with early adopters including Clearwater Analytics, Ericsson, Sony, Thomson Reuters, and other cross-industry enterprises.
  • Built on microVM technology for enterprise-grade security isolation—each agent session runs in its own isolated virtual machine instance, preventing data leakage and cross-tenant attacks.

  • AgentCore offers composable services supporting multiple frameworks: CrewAI, Google ADK, LangGraph, LlamaIndex, OpenAI Agents SDK, Strands Agents, and more.
  • Works with models on Amazon Bedrock, as well as external models like OpenAI and Gemini.
  • AgentCore Code Interpreter enables agents to safely generate and execute code in isolated environments.
  • AgentCore Browser allows agents to interact with web applications at scale.
  • AgentCore Gateway transforms existing APIs and AWS Lambda functions into agent-compatible tools.
  • Gateway connects to existing MCP servers and integrates third-party tools like Slack, Jira, Asana, and Zendesk.
  • AgentCore Identity enables agents to securely access and operate various tools using OAuth standards.
  • AgentCore Memory helps build context-aware agents without managing complex memory infrastructure.

  • Provides industry-leading security through microVM technology, with each agent session in its own isolated compute environment.
  • AgentCore’s MCP server integrates with IDEs like Kiro or Cursor AI.
  • Offers an industry-leading 8-hour runtime for long-running tasks.

  • Now that it’s GA (Generally Available), no more waiting in queue—just spin it up and start playing!

  • Deploying AI Agents requires integration with existing workflows, fine-tuning, and alignment with organizational goals.
  • For those interested in Process Automation whiteboards, check out the extended reading 4.

  • P.S. On my return trip through Tokyo in September, I actually got hit by the tail end of summer there—scorching hot… Major respect to the Japanese salarymen in full suits. Orz…

Read More

Beyond Efficiency: The Neuroscience Case for Keeping Handwriting in Digital Age

Post Title Image (Illustration: Preparing to unbox the Remarkable Paper Pro. Image source: Ernest.)

✳️ tl;dr

  • In Ernest PKM 1, I mentioned that I still maintain paper-based notes, along with using digital handwritten notes.
  • Last year, I acquired a reMarkable Paper Pro, which is also handwriting-based.
  • There’s always this feeling that when I’m surrounded by heaps of mixed information and want to quiet down and clarify the complex cognition at hand,
  • I usually grab a handwriting tool, put on noise-canceling headphones, sit quietly for a few minutes, and then start writing to output, categorize, and compare.
  • I can always sort things out. Even if no clear structure emerges, at least some branches grow.
  • But feelings are just feelings. I always want to find a causal explanation (ah, is this a bad habit? Anyway, it’s a habitual action; the root cause habits formed at TSMC are too deeply ingrained).

  • Searching and searching, I found that Dr. Audrey van der Meer 2 has long been focused on this field,
  • Below are her research results published in 2024-01. 3 4
  • The study recorded brain electrical activity in 36 university students as they were handwriting with a digital pen and typing on a keyboard.
  • Brain connectivity patterns during handwriting were far more complex and elaborate than during typing.
  • Handwriting produced widespread theta/alpha frequency connectivity patterns, which are crucial for memory formation and learning.

Read More

The Westin Building Exchange

Post Title Image (Photo taken during my visit to Amazon Spheres, with The Westin Building visible on the far left. Image source: Ernest.)

✳️ tl;dr

  • I’ve passed by this building countless times while attending meetings or exploring coffee shops and restaurants in downtown Seattle, never realizing its significance.
  • The Westin Building Exchange is a major telecommunications hub facility located in downtown Seattle, Washington. 1
  • The building was constructed in 1981 (around the same era as me XDD).
  • Originally named The Westin Building, it served as the headquarters for the Seattle-based Westin Hotel chain.
  • It also houses the Seattle Internet Exchange (SIX) and the Pacific Northwest Gigapop Pacific Wave Exchange.

  • Since 2019 or earlier, heat generated by the building’s data center has been piped to Amazon’s Doppler building next door, providing heating for Doppler and several other Amazon buildings.
  • Amazon estimates that over the system’s expected 25-year lifespan, it will save 80 million kilowatt-hours of electricity, equivalent to 65 million pounds of coal.
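Amazon's savings estimate above pairs 80 million kWh with 65 million pounds of coal; the implied conversion factor is easy to back out (this sketch simply divides the two quoted figures):

```python
# Implied coal-per-kWh factor behind the Amazon heat-reuse estimate above.
KWH_SAVED = 80_000_000   # electricity saved over the 25-year lifespan
COAL_LB = 65_000_000     # stated coal equivalent

lb_per_kwh = COAL_LB / KWH_SAVED   # 0.8125 lb of coal per kWh
print(f"{lb_per_kwh:.4f} lb coal per kWh")
```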

  • The trigger: I came across The Verge’s report on Microsoft’s plan to migrate GitHub entirely to Azure, which made me curious about its infrastructure before the acquisition. 2
  • GitHub was acquired by Microsoft in 2018. So I went back to check the situation in 2017.
  • GitHub’s blog once described: “Those facilities don’t store customer data, rather they’re focused on internet and backbone connectivity as well as direct connect and private network interfaces to Amazon Web Services.” 3
  • GitHub is now (2025) undergoing what may be its largest infrastructure migration ever, planning to move the entire platform from self-owned data centers to Azure within 24 months.
  • The core driver of this migration is the explosive growth of AI workloads: GitHub Copilot generates millions of code suggestions daily, consuming massive computing capacity, and the existing data centers have reached their physical expansion limits. (There’s always an official narrative for the public—take it with a grain of salt) (There’s a kind of cold where grandma thinks you’re cold. There’s a kind of migration where ____ tells you to migrate?!)
  • Migration strategy: Most work needs to be completed within 12 months (because new and old systems need to run in parallel for at least 6 months); GitHub has asked teams to delay feature development and prioritize infrastructure migration.

  • In 2024, GitHub experienced 119 incidents, including 26 major outages, with an average resolution time of about 106 minutes. 4
  • Curious to see how it performs after the move.
  • For those of you tech managers using GitHub, I’m curious—what’s your take? Will this trigger any preparations on your end?


Read More

From Maker Movement to Industrial AI: How Qualcomm's Arduino Acquisition Reshapes Embedded Computing

Post Title Image (Image source: Arduino UNO Q.)

✳️ tl;dr

  • Qualcomm acquired Arduino, marking the third strategic acquisition following Edge Impulse and Foundries.io, demonstrating Qualcomm’s determination (or strategic positioning?) to build a full-stack edge AI platform 1 2
  • Arduino boasts over 33 million active community members, a developer foundation that competitors like Raspberry Pi don’t have.

  • The newly launched Arduino UNO Q adopts a “dual-brain architecture”: Qualcomm Dragonwing QRB2210 quad-core processor (Quad-core Arm Cortex-A53 @ 2GHz) paired with STM32U585 real-time microcontroller (Arm Cortex-M33 @ 160MHz), simultaneously handling high-performance computing and real-time control.
  • Wireless connectivity = dual-band Wi-Fi 5 + Bluetooth 5.1
  • Edge Impulse integration provides AutoML capabilities, enabling even non-AI experts to deploy machine learning models.
  • The TinyML market is projected to reach $200 billion by 2030, and Qualcomm is positioning itself to capture this high-growth segment?!
  • UNO Q is priced at $44 (2GB RAM/16GB eMMC) to $59 (4GB/32GB), about $7 less than comparable Raspberry Pi 5 models. (But RPi 5 now offers 8GB RAM and 16GB RAM options, worth tracking going forward.)

  • Software is licensed under GPL 3.0 or Mozilla Public License, hardware design under CC BY-SA 4.0, meaning you can legally create and sell derivative versions


Read More