

GitHub Pulse: Week of January 31, 2026

AI agents went mainstream this week. OpenClaw hit 132k stars, MCP is becoming infrastructure, and vectorless RAG is challenging embeddings.

January 31, 2026 · 4 min read · github · trending · ai · mcp · agents

What developers are building this week, and why it matters.

This Week’s Signal

AI agents went mainstream. Not chatbots, but autonomous agents that write code, manage files, and work across platforms. 64% of trending repos focus on agents or LLM infrastructure. Meanwhile, MCP (Model Context Protocol) is quietly becoming infrastructure: two MCP-native projects are trending, and “MCP support” is now a feature bullet.

The CLI agent wars are heating up. OpenClaw’s meteoric rise (+15k stars/day) signals that developers want personal AI assistants that run locally and respect data ownership. Meanwhile, the RAG space is fragmenting: PageIndex bets on vectorless retrieval while UltraRAG doubles down on MCP integration.

AI Agents

Three distinct approaches this week: OpenClaw (cross-platform personal assistant), pi-mono (developer-focused toolkit), and kimi-cli (terminal-native agent).

openclaw/openclaw

132k ★ (+15.1k/day) — Personal AI assistant that works across Slack, Discord, WhatsApp, Telegram, and more.

The cross-platform angle is genuinely novel; most AI assistants are single-platform. The “own your data” positioning is resonating hard. The numbers are staggering: 132k stars in the two months since launch, with a 14.5% fork ratio suggesting real usage beyond the stars. This is the year’s breakout AI project so far.

Try it: npm install openclaw

badlogic/pi-mono

4.0k ★ (+218/day) — AI agent toolkit: coding agent CLI, unified LLM API, TUI & web UI libraries, Slack bot, vLLM pods.

Mario Zechner (of libGDX fame) strikes again. This is the infrastructure powering OpenClaw’s coding capabilities. Unified LLM API means you can swap providers without rewriting code. MIT licensed and actively maintained.
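The provider-swap idea is a simple abstraction pattern: application code depends on one shared interface, and backends are interchangeable behind it. A minimal Python sketch of the concept (pi-mono itself is TypeScript, and the class and function names below are invented for illustration, not its real API):

```python
# Sketch of the "unified LLM API" pattern: one interface, swappable backends.
# Names here are illustrative, not pi-mono's actual API.
from typing import Protocol


class LLMProvider(Protocol):
    def complete(self, prompt: str) -> str: ...


class OpenAIBackend:
    def complete(self, prompt: str) -> str:
        # Real code would call the OpenAI API here.
        return f"[openai] {prompt}"


class AnthropicBackend:
    def complete(self, prompt: str) -> str:
        # Real code would call the Anthropic API here.
        return f"[anthropic] {prompt}"


def ask(provider: LLMProvider, prompt: str) -> str:
    # Application code depends only on the shared interface,
    # so swapping providers requires no rewrite.
    return provider.complete(prompt)


reply = ask(OpenAIBackend(), "hello")  # swap in AnthropicBackend() freely
```

The design win is that provider quirks stay inside each backend class; everything above `ask` is provider-agnostic.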

Try it: npm install pi-mono

MoonshotAI/kimi-cli

5.5k ★ (+183/day) — Kimi Code CLI is your next CLI agent.

Moonshot AI’s entry into the CLI agent race. Python-based alternative to TypeScript-heavy Western tools. Uses the emerging Agent Client Protocol standard, suggesting interoperability ambitions. Apache-2.0 license.

Try it: pip install kimi-cli

LLM Infrastructure

The race to build the “memory layer” for AI applications is driving massive interest.

supermemoryai/supermemory

16.0k ★ (+243/day) — Extremely fast, scalable memory engine and app. The Memory API for the AI era.

Solves the “goldfish memory” problem in AI apps. Multi-provider support (Anthropic, Google, OpenAI, Cerebras) makes it production-ready. Built on Cloudflare’s edge stack for speed. The “second brain” positioning resonates with developers tired of re-explaining context to their AI tools.
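The pattern behind a memory layer is easy to sketch: persist snippets across sessions, then recall the relevant ones into the prompt instead of re-explaining them. A toy Python illustration (generic, not supermemory’s actual API; the keyword-overlap scoring is a stand-in for real retrieval):

```python
# Toy sketch of the "memory layer" pattern: store context once,
# recall it into later prompts. Not supermemory's actual API.


class MemoryStore:
    def __init__(self) -> None:
        self.snippets: list[str] = []

    def add(self, snippet: str) -> None:
        self.snippets.append(snippet)

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Toy relevance score: number of shared lowercase words.
        q = set(query.lower().split())
        scored = sorted(
            self.snippets,
            key=lambda s: len(q & set(s.lower().split())),
            reverse=True,
        )
        return scored[:k]


mem = MemoryStore()
mem.add("User prefers TypeScript for frontend work")
mem.add("Project deadline is March 15")

# Prepend the recalled snippets to the LLM prompt instead of re-explaining.
context = mem.recall("which language for the frontend")
```

A production memory layer swaps the word-overlap scorer for proper retrieval and persists the store across sessions; the shape of the API stays the same.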

Try it: npm install supermemory

OpenBMB/UltraRAG

4.9k ★ (+311/day) — UltraRAG v3: A Low-Code MCP Framework for Building Complex and Innovative RAG Pipelines.

The v3 release with MCP support is driving this spike. “Low-code” RAG pipeline builder that actually works with multiple embedding providers (vLLM, sentence-transformers). From OpenBMB (Tsinghua University’s BigModel lab), so expect solid research backing. Apache-2.0 license makes it enterprise-friendly.

Try it: pip install UltraRAG or claude mcp add UltraRAG

RAG & Document Intelligence

Two contrasting approaches: MCP standardization vs. rethinking fundamentals.

VectifyAI/PageIndex

11.2k ★ (+627/day) — “Vectorless RAG” using tree-structured reasoning instead of embeddings.

The pitch: embeddings find similar text, not relevant text. PageIndex builds a hierarchical index (like a table of contents), then uses LLM reasoning to navigate to the right section.

Why it’s interesting:

  1. Zero vector infrastructure (no Pinecone, no embeddings)
  2. Works on document structure, not just content
  3. Minimal dependencies, so it’s easy to evaluate

The 17x velocity spike suggests developers are actively frustrated with traditional RAG.
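The navigation loop can be sketched in a few lines: build a table-of-contents tree, then descend it by asking which branch best matches the query. A hedged Python sketch (the word-overlap `choose` function stands in for an LLM reasoning call; PageIndex’s real index format and prompts differ):

```python
# Sketch of "vectorless RAG": navigate a document's table-of-contents tree
# with reasoning instead of embedding similarity. `choose` is a stub for
# an LLM call like "Which section best answers this query?"
from dataclasses import dataclass, field


@dataclass
class Node:
    title: str
    text: str = ""
    children: list["Node"] = field(default_factory=list)


def choose(query: str, options: list[Node]) -> Node:
    # Stand-in for LLM reasoning: pick the section whose title
    # shares the most words with the query.
    q = set(query.lower().split())
    return max(options, key=lambda n: len(q & set(n.title.lower().split())))


def retrieve(query: str, root: Node) -> str:
    node = root
    while node.children:          # descend until a leaf section
        node = choose(query, node.children)
    return node.text              # pass this section to the LLM as context


doc = Node("Handbook", children=[
    Node("Refunds and billing", children=[
        Node("How refunds work", text="Refunds within 30 days."),
        Node("Invoices", text="Invoices are emailed monthly."),
    ]),
    Node("Password and security", children=[
        Node("Password reset", text="Use the reset link."),
    ]),
])

print(retrieve("how do refunds work", doc))  # → Refunds within 30 days.
```

Note what’s absent: no embedding model, no vector database. The index is just the document’s own hierarchy, which is the whole pitch.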

Try it: pip install PageIndex

modelcontextprotocol/ext-apps

1.1k ★ (+77/day) — MCP Apps protocol—standard for UIs embedded in AI chatbots, served by MCP servers.

Official MCP extension from the protocol maintainers. SEP-1865 spec just went stable (2026-01-26). This is the building block for rich UI experiences inside Claude and other MCP-compatible assistants. If you’re building MCP servers, this is required reading.

Try it: npm install ext-apps or claude mcp add ext-apps

Developer Tools

remotion-dev/remotion

34.1k ★ (+862/day) — Make videos programmatically with React.

The 51x velocity spike is the highest this week. Five years old but suddenly relevant as AI content creation demands programmatic video pipelines. React-based approach means familiar DX for web developers.

Try it: npx create-video@latest

Pattern Watch

| Pattern | Signal |
| --- | --- |
| CLI agents becoming mainstream | Three major projects all trending simultaneously. Terminal-first AI is the new default. |
| MCP adoption accelerating | UltraRAG and ext-apps both MCP-native. Protocol is becoming infrastructure, not just spec. |
| Vectorless RAG emerges | PageIndex’s approach challenges embedding orthodoxy. Watch for more “reasoning over retrieval” projects. |
| Memory layer consolidation | supermemory’s rise shows demand for standardized context management across AI tools. |

What This Means

Four shifts happening:

  1. From chatbots to agents — Autonomous action, not just generation. OpenClaw’s 132k stars in 2 months proves massive demand.

  2. From embeddings to reasoning — PageIndex represents counter-movement in RAG. Not everything needs vector search.

  3. From protocol to ecosystem — MCP’s growth from spec to infrastructure. “MCP support” is now a feature, not an experiment.

  4. Memory as infrastructure — supermemory and similar projects show the next layer forming: persistent context across AI interactions.

The question: Does OpenClaw’s spike sustain? That tells us if “personal AI assistant” is a real category or just hype.


GitHub Pulse is a weekly analysis of what developers are building. Signal, not noise.