Compare commits


12 Commits

Author SHA1 Message Date
Bryan Thompson
61ed9ce698 Consolidate Oracle NetSuite skills into a single plugin 2026-04-17 15:53:55 -05:00
Bryan Thompson
3df5394ee9 Merge pull request #1463 from anthropics/add-netsuite-plugins
Add Oracle NetSuite agent skills (3 plugins)
2026-04-17 15:39:51 -05:00
Bryan Thompson
12401af104 Add Oracle NetSuite agent skills (3 plugins)
Adds three NetSuite agent skills to the official marketplace:

- netsuite-aiconnector-service-skill: runtime guidance for the NetSuite
  AI Service Connector (tool selection, output formatting, SuiteQL
  safety checklist)
- netsuite-sdf-roles-and-permissions: SDF permission ID lookup and
  least-privilege role authoring (ADMI_, LIST_, REGT_, REPO_, TRAN_)
- netsuite-uif-spa-reference: API/type reference for @uif-js/core and
  @uif-js/component

All three ship from oracle/netsuite-suitecloud-sdk (packages/agent-skills/)
using git-subdir + strict:false + skills[] — the same shape stagehand uses
for skill-only distributions.
2026-04-17 15:10:22 -05:00
Bryan Thompson
167f01f2e0 Add auto-SHA-bump workflow for marketplace plugins (#1392)
* Add auto-SHA-bump workflow for marketplace plugins

Weekly CI action that discovers stale SHA pins in marketplace.json
and opens a batched PR with updated SHAs. Adapted from the
claude-plugins-community-internal bump-plugin-shas workflow for
the single-file marketplace.json format.

- discover_bumps.py: checks 56 SHA-pinned plugins against upstream
  repos, oldest-stale-first rotation, capped at 20 bumps/run
- bump-plugin-shas.yml: weekly Monday schedule + manual dispatch
  with dry_run and per-plugin targeting options

Entries without SHA pins (intentionally tracking HEAD) are never
touched. Existing validate-marketplace CI runs on the resulting PR.
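
The selection policy this commit describes (skip unpinned entries, rotate oldest-pinned first, cap per run) can be sketched in miniature; the plugin shapes and ages below are illustrative, not real marketplace data:

```python
# Minimal sketch of the bump-selection policy: only SHA-pinned entries are
# candidates, stale ones are sorted oldest-pin-first, and a cap bounds each run.

def select_bumps(plugins, stale_ages, cap=20):
    """Pick up to `cap` stale SHA-pinned entries, oldest pin first.

    `stale_ages` maps plugin name -> days since its pinned commit,
    for entries whose upstream has moved past the pin.
    """
    candidates = []
    for p in plugins:
        src = p.get("source")
        if not isinstance(src, dict) or "sha" not in src:
            continue  # unpinned entries (intentionally tracking HEAD) are never touched
        if p["name"] in stale_ages:
            candidates.append((stale_ages[p["name"]], p["name"]))
    # Oldest-pinned first so nothing starves under the cap.
    candidates.sort(key=lambda t: -t[0])
    return [name for _, name in candidates[:cap]]

plugins = [
    {"name": "a", "source": {"source": "git-subdir", "sha": "x"}},
    {"name": "b", "source": {"source": "git-subdir", "sha": "y"}},
    {"name": "c", "source": {"source": "git-subdir"}},  # no pin: skipped
]
print(select_bumps(plugins, {"a": 3, "b": 40}, cap=1))  # -> ['b']
```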

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* Fix input interpolation and add BASE_BRANCH overlay

- Pass workflow_dispatch inputs through env vars instead of direct
  ${{ inputs.* }} interpolation in run blocks (avoids shell injection)
- Add marketplace.json overlay from main so the workflow can be tested
  via dispatch from a feature branch against main's real plugin data

Both patterns match claude-plugins-community-internal's implementation.
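
The injection risk that motivates the env-var pattern can be demonstrated with a small Python stand-in for a shell `run:` block (the payload string is hypothetical):

```python
import os
import subprocess

# Direct interpolation: untrusted input is spliced into shell source text, so
# metacharacters become code. This is what ${{ inputs.* }} inside a run: block
# amounts to.
user_input = 'x"; echo INJECTED; "'   # hypothetical malicious dispatch input
unsafe = f'echo "plugin: {user_input}"'
out = subprocess.run(unsafe, shell=True, capture_output=True, text=True).stdout
print("INJECTED" in out.splitlines())  # -> True: the payload ran as a command

# Env-var pattern: the value reaches the shell as data, never as source.
safe = 'echo "plugin: $PLUGIN"'
out = subprocess.run(safe, shell=True, capture_output=True, text=True,
                     env={**os.environ, "PLUGIN": user_input}).stdout
print("INJECTED" in out.splitlines())  # -> False: it stayed inside the string
```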

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* Use GitHub App token for PR creation

The anthropics org disables "Allow GitHub Actions to create and approve
pull requests", so GITHUB_TOKEN cannot call gh pr create. Split the
workflow: GITHUB_TOKEN pushes the branch, then the same GitHub App
used by -internal's bump workflow (app-id 2812036) creates the PR.

Prerequisite: app must be installed on this repo and the PEM secret
(CLAUDE_DIRECTORY_BOT_PRIVATE_KEY) must exist in repo settings.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

* Use --force-with-lease for bump branch push

Prevents push failure if the branch exists from a previous same-day
run whose PR was merged but whose branch wasn't auto-deleted.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-17 09:12:36 -07:00
Noah Zweben
637c6b3b6a Add Apache 2.0 license to session-report plugin (#1346)
Co-authored-by: Claude <noreply@anthropic.com>
2026-04-17 09:11:39 -07:00
Bryan Thompson
811c9b5394 Merge pull request #1453 from anthropics/add-miro
Add miro plugin
2026-04-17 11:08:43 -05:00
Tobin South
b00abee24e Align mcp-server-dev skills with claude.com/docs connector guidance (#1418)
- build-mcp-server: load llms-full.txt for Claude-specific context;
  add Phase 6 (test in Claude, review checklist, submit, ship plugin)
- references/auth.md: add Claude auth-type table, callback URL,
  not-supported list
- references/tool-design.md: add Anthropic Directory hard requirements
  (annotations, name length, read/write split, prompt-injection rule)
- build-mcp-app: add Claude host specifics (prefersBorder,
  safeAreaInsets, CSP) and submission asset specs; testing via
  custom connector
- build-mcpb: note remote servers are the recommended directory path
2026-04-17 17:02:48 +01:00
Bryan Thompson
5c5c5f9896 Remove SHA pin from miro entry 2026-04-17 08:57:39 -05:00
Bryan Thompson
8518bfc43d Add miro plugin 2026-04-17 08:53:58 -05:00
Tobin South
b992a65037 Refresh AWS plugins: add amplify/databases/sagemaker, remove migration (#1226)
Adds three plugins from awslabs/agent-plugins:
- aws-amplify (development)
- databases-on-aws (database)
- sagemaker-ai (development)

Removes migration-to-aws (deprecated by the AWS team).
2026-04-17 06:29:23 -05:00
Dickson Tsai
de39da5ba2 Point supabase plugin to supabase-community/supabase-plugin (#1442)
Remove the in-repo supabase stub; source from the external repo.

Co-authored-by: Claude <noreply@anthropic.com>
2026-04-16 21:01:47 +01:00
Bryan Thompson
cb8c857a5e Normalize git-subdir source URLs to full HTTPS format (#1422)
Standardize 12 git-subdir plugin entries from owner/repo shorthand to
full https://github.com/owner/repo.git URLs for consistency with the
existing HTTPS entries.

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-16 21:01:08 +01:00
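
The normalization this commit performs is mechanical; a sketch, assuming only bare `owner/repo` shorthands need rewriting and everything else passes through:

```python
import re

def normalize_source_url(url: str) -> str:
    """Expand `owner/repo` shorthand to a full HTTPS clone URL.

    Already-full URLs (or anything that is not a bare shorthand)
    pass through unchanged.
    """
    if re.fullmatch(r"[\w.-]+/[\w.-]+", url):
        return f"https://github.com/{url}.git"
    return url

print(normalize_source_url("expo/skills"))
# -> https://github.com/expo/skills.git
print(normalize_source_url("https://github.com/expo/skills.git"))
# -> https://github.com/expo/skills.git (unchanged)
```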
8 changed files with 497 additions and 24 deletions


@@ -44,7 +44,7 @@
       "description": "AI-first project auditor and re-engineer based on the 9 design principles and 7 design patterns from the TechWolf AI-First Bootcamp",
       "source": {
         "source": "git-subdir",
-        "url": "techwolf-ai/ai-first-toolkit",
+        "url": "https://github.com/techwolf-ai/ai-first-toolkit.git",
         "path": "plugins/ai-firstify",
         "ref": "main",
         "sha": "7f18e11d694b9ae62ea3009fbbc175f08ae913df"
@@ -156,6 +156,18 @@
       "source": "./external_plugins/autofix-bot",
       "homepage": "https://github.com/anthropics/claude-plugins-public/tree/main/external_plugins/autofix-bot"
     },
+    {
+      "name": "aws-amplify",
+      "description": "Build full-stack apps with AWS Amplify Gen 2 using guided workflows for authentication, data models, storage, GraphQL APIs, and Lambda functions.",
+      "category": "development",
+      "source": {
+        "source": "git-subdir",
+        "url": "https://github.com/awslabs/agent-plugins.git",
+        "path": "plugins/aws-amplify",
+        "ref": "main"
+      },
+      "homepage": "https://github.com/awslabs/agent-plugins"
+    },
     {
       "name": "aws-serverless",
       "description": "Design, build, deploy, test, and debug serverless applications with AWS Serverless services.",
@@ -209,7 +221,7 @@
       "category": "database",
       "source": {
         "source": "git-subdir",
-        "url": "Bigdata-com/bigdata-plugins-marketplace",
+        "url": "https://github.com/Bigdata-com/bigdata-plugins-marketplace.git",
         "path": "plugins/bigdata-com",
         "ref": "main"
       },
@@ -446,6 +458,18 @@
       },
       "homepage": "https://github.com/astronomer/agents"
     },
+    {
+      "name": "databases-on-aws",
+      "description": "Expert database guidance for the AWS database portfolio. Design schemas, execute queries, handle migrations, and choose the right database for your workload.",
+      "category": "database",
+      "source": {
+        "source": "git-subdir",
+        "url": "https://github.com/awslabs/agent-plugins.git",
+        "path": "plugins/databases-on-aws",
+        "ref": "main"
+      },
+      "homepage": "https://github.com/awslabs/agent-plugins"
+    },
     {
       "name": "dataverse",
       "description": "Agent skills for building on, analyzing, and managing Microsoft Dataverse — with Dataverse MCP, PAC CLI, and Python SDK.",
@@ -503,7 +527,7 @@
       "category": "development",
       "source": {
         "source": "git-subdir",
-        "url": "expo/skills",
+        "url": "https://github.com/expo/skills.git",
         "path": "plugins/expo",
         "ref": "main"
       },
@@ -670,7 +694,7 @@
       "description": "Build on Solana with Helius — live blockchain tools, expert coding patterns, and autonomous account signup",
       "source": {
         "source": "git-subdir",
-        "url": "helius-labs/core-ai",
+        "url": "https://github.com/helius-labs/core-ai.git",
         "path": "helius-plugin",
         "ref": "main",
         "sha": "05ea4d1128d46618266bbcc23a5e7019c57be0d6"
@@ -785,7 +809,7 @@
       "category": "productivity",
       "source": {
         "source": "git-subdir",
-        "url": "legalzoom/claude-plugins",
+        "url": "https://github.com/legalzoom/claude-plugins.git",
         "path": "plugins/legalzoom",
         "ref": "main",
         "sha": "f9fd8a0ca6e1421bc1aacb113a109663a7a6f6d8"
@@ -851,18 +875,6 @@
       },
       "homepage": "https://github.com/microsoftdocs/mcp"
     },
-    {
-      "name": "migration-to-aws",
-      "description": "Assess current cloud provider usage and billing to estimate and compare AWS services and pricing, with recommendations for migration or continued use of current provider.",
-      "category": "migration",
-      "source": {
-        "source": "git-subdir",
-        "url": "https://github.com/awslabs/agent-plugins.git",
-        "path": "plugins/migration-to-aws",
-        "ref": "main"
-      },
-      "homepage": "https://github.com/awslabs/agent-plugins"
-    },
     {
       "name": "mintlify",
       "description": "Build beautiful documentation sites with Mintlify. Convert non-markdown files into properly formatted MDX pages, add and modify content with correct component use, and automate documentation updates.",
@@ -874,6 +886,21 @@
       },
       "homepage": "https://www.mintlify.com/"
     },
+    {
+      "name": "miro",
+      "description": "Secure access to Miro boards. Enables AI to read board context, create diagrams, and generate code with enterprise-grade security.",
+      "author": {
+        "name": "Miro"
+      },
+      "category": "design",
+      "source": {
+        "source": "git-subdir",
+        "url": "https://github.com/miroapp/miro-ai.git",
+        "path": "claude-plugins/miro",
+        "ref": "main"
+      },
+      "homepage": "https://miro.com"
+    },
     {
       "name": "mongodb",
       "description": "Official Claude plugin for MongoDB (MCP Server + Skills). Connect to databases, explore data, manage collections, optimize queries, generate reliable code, implement best practices, develop advanced features, and more.",
@@ -891,7 +918,7 @@
       "category": "database",
       "source": {
         "source": "git-subdir",
-        "url": "neondatabase/agent-skills",
+        "url": "https://github.com/neondatabase/agent-skills.git",
         "path": "plugins/neon-postgres",
         "ref": "main",
         "sha": "54d7a9db2ddd476f84d5d1fd7bac323907858a8b"
@@ -908,6 +935,28 @@
       },
       "homepage": "https://github.com/netlify/context-and-tools"
     },
+    {
+      "name": "netsuite-suitecloud",
+      "description": "NetSuite agent skills from Oracle — authoring guidance for SuiteCloud Development Framework (SDF) objects and UIF single-page-app components, plus runtime guidance for the NetSuite AI Service Connector.",
+      "author": {
+        "name": "Oracle NetSuite"
+      },
+      "category": "development",
+      "source": {
+        "source": "git-subdir",
+        "url": "https://github.com/oracle/netsuite-suitecloud-sdk.git",
+        "path": "packages/agent-skills",
+        "ref": "master",
+        "sha": "43bacf43763e1eedd0892b4652be3d45df94f0e7"
+      },
+      "strict": false,
+      "skills": [
+        "./netsuite-ai-connector-instructions",
+        "./netsuite-sdf-roles-and-permissions",
+        "./netsuite-uif-spa-reference"
+      ],
+      "homepage": "https://github.com/oracle/netsuite-suitecloud-sdk"
+    },
     {
       "name": "nightvision",
       "description": "Skills for working with NightVision, a DAST and API Discovery platform that finds exploitable vulnerabilities in web applications and REST APIs",
@@ -1109,7 +1158,7 @@
       "category": "development",
       "source": {
         "source": "git-subdir",
-        "url": "pydantic/skills",
+        "url": "https://github.com/pydantic/skills.git",
         "path": "plugins/ai",
         "ref": "main"
       },
@@ -1155,7 +1204,7 @@
       "category": "deployment",
       "source": {
         "source": "git-subdir",
-        "url": "railwayapp/railway-skills",
+        "url": "https://github.com/railwayapp/railway-skills.git",
         "path": "plugins/railway",
         "ref": "main",
         "sha": "d52f3741a6a33a3191d6138eb3d6c3355cb970d1"
@@ -1249,6 +1298,18 @@
         }
       }
     },
+    {
+      "name": "sagemaker-ai",
+      "description": "Build, train, and deploy AI models with deep AWS AI/ML expertise brought directly into your coding assistants, covering the surface area of Amazon SageMaker AI.",
+      "category": "development",
+      "source": {
+        "source": "git-subdir",
+        "url": "https://github.com/awslabs/agent-plugins.git",
+        "path": "plugins/sagemaker-ai",
+        "ref": "main"
+      },
+      "homepage": "https://github.com/awslabs/agent-plugins"
+    },
     {
       "name": "sanity",
       "description": "Sanity content platform integration with MCP server, agent skills, and slash commands. Query and author content, build and optimize GROQ queries, design schemas, and set up Visual Editing.",
@@ -1448,7 +1509,7 @@
       "category": "development",
       "source": {
         "source": "git-subdir",
-        "url": "stripe/ai",
+        "url": "https://github.com/stripe/ai.git",
         "path": "providers/claude/plugin",
         "ref": "main"
       },
@@ -1558,7 +1619,7 @@
       "category": "development",
       "source": {
         "source": "git-subdir",
-        "url": "UI5/plugins-claude",
+        "url": "https://github.com/UI5/plugins-claude.git",
         "path": "plugins/ui5",
         "ref": "main",
         "sha": "5070dfc1cef711d6efad40beb43750027039d71f"
@@ -1571,7 +1632,7 @@
       "category": "development",
       "source": {
         "source": "git-subdir",
-        "url": "UI5/plugins-claude",
+        "url": "https://github.com/UI5/plugins-claude.git",
         "path": "plugins/ui5-typescript-conversion",
         "ref": "main",
         "sha": "5070dfc1cef711d6efad40beb43750027039d71f"
@@ -1625,7 +1686,7 @@
       "category": "productivity",
       "source": {
         "source": "git-subdir",
-        "url": "zapier/zapier-mcp",
+        "url": "https://github.com/zapier/zapier-mcp.git",
         "path": "plugins/zapier",
         "ref": "main",
         "sha": "b93007e9a726c6ee93c57a949e732744ef5acbfd"

.github/scripts/discover_bumps.py (vendored, new file, 229 lines)

@@ -0,0 +1,229 @@
#!/usr/bin/env python3
"""Discover plugins in marketplace.json whose upstream repo has moved past
their pinned SHA, update the file in place, and emit a summary.

Adapted from claude-plugins-community-internal's discover_bumps.py for the
single-file marketplace.json format used by claude-plugins-official.

Usage: discover_bumps.py [--plugin NAME] [--max N] [--dry-run]
"""
import argparse
import json
import os
import re
import subprocess
import sys
from datetime import datetime, timezone
from typing import Any

MARKETPLACE_PATH = ".claude-plugin/marketplace.json"


def gh_api(path: str) -> Any:
    """GET from the GitHub API. None on not-found; raises on other errors.

    "Not found" covers both 404 (resource gone) and 422 "No commit found
    for SHA" (force-pushed away). Both mean the thing we asked for isn't
    there — treating them the same lets callers handle dead refs uniformly.
    """
    r = subprocess.run(
        ["gh", "api", path], capture_output=True, text=True
    )
    if r.returncode != 0:
        combined = r.stdout + r.stderr
        if any(s in combined for s in ("404", "Not Found", "No commit found")):
            return None
        raise RuntimeError(f"gh api {path}: {r.stderr.strip() or r.stdout.strip()}")
    return json.loads(r.stdout)


def parse_github_repo(url: str) -> tuple[str, str] | None:
    """Extract (owner, repo) from a URL or owner/repo shorthand."""
    # Full URL: https://github.com/owner/repo(.git)(/...)
    m = re.match(r"https?://github\.com/([^/]+)/([^/]+?)(?:\.git)?(?:/|$)", url)
    if m:
        return m.group(1), m.group(2)
    # Shorthand: owner/repo
    m = re.match(r"^([\w.-]+)/([\w.-]+)$", url)
    if m:
        return m.group(1), m.group(2)
    return None


def latest_sha(owner: str, repo: str, *, ref: str | None, path: str | None) -> str | None:
    """Latest commit SHA for the repo, optionally scoped to a ref and/or path."""
    if path:
        # Scoped to a subdirectory — use the commits list endpoint with path filter.
        q = f"repos/{owner}/{repo}/commits?per_page=1&path={path}"
        if ref:
            q += f"&sha={ref}"
        commits = gh_api(q)
        if not commits:
            return None
        return commits[0]["sha"]
    # Whole repo — the single-ref endpoint is cheaper.
    if not ref:
        meta = gh_api(f"repos/{owner}/{repo}")
        if not meta:
            return None
        ref = meta["default_branch"]
    c = gh_api(f"repos/{owner}/{repo}/commits/{ref}")
    return c["sha"] if c else None


def pinned_age_days(owner: str, repo: str, sha: str) -> int | None:
    """Days since the pinned commit was authored. Used for oldest-first rotation."""
    c = gh_api(f"repos/{owner}/{repo}/commits/{sha}")
    if not c:
        return None
    dt = datetime.fromisoformat(
        c["commit"]["committer"]["date"].replace("Z", "+00:00")
    )
    return (datetime.now(timezone.utc) - dt).days


def main() -> int:
    ap = argparse.ArgumentParser()
    ap.add_argument("--plugin", help="only check this plugin")
    ap.add_argument("--max", type=int, default=20, help="cap bumps emitted")
    ap.add_argument("--dry-run", action="store_true", help="don't write marketplace.json")
    args = ap.parse_args()

    with open(MARKETPLACE_PATH) as f:
        marketplace = json.load(f)
    plugins = marketplace.get("plugins", [])

    bumps: list[dict] = []
    dead: list[str] = []
    skipped_non_github = 0
    checked = 0

    for plugin in plugins:
        name = plugin.get("name", "?")
        src = plugin.get("source")
        # Only process object sources with a sha field
        if not isinstance(src, dict) or "sha" not in src:
            continue
        # Filter to specific plugin if requested
        if args.plugin and name != args.plugin:
            continue
        checked += 1

        kind = src.get("source")
        url = src.get("url", "")
        path = src.get("path")
        ref = src.get("ref")
        pinned = src.get("sha")

        slug = parse_github_repo(url)
        if not slug:
            skipped_non_github += 1
            continue
        owner, repo = slug

        try:
            latest = latest_sha(owner, repo, ref=ref, path=path)
        except RuntimeError as e:
            print(f"::warning::{name}: {e}", file=sys.stderr)
            continue
        if latest is None:
            dead.append(f"{name} ({owner}/{repo})")
            continue
        if latest == pinned:
            continue  # up to date

        # Age lookup for rotation — oldest-pinned first prevents starvation.
        try:
            age = pinned_age_days(owner, repo, pinned) if pinned else None
        except RuntimeError as e:
            print(f"::warning::{name}: age lookup failed: {e}", file=sys.stderr)
            age = None

        bumps.append({
            "name": name,
            "kind": kind,
            "url": url,
            "path": path or "",
            "ref": ref or "",
            "old_sha": pinned or "",
            "new_sha": latest,
            "age_days": age if age is not None else 10**6,
        })

    # Oldest-pinned first so nothing starves under the cap.
    bumps.sort(key=lambda b: -b["age_days"])
    emitted = bumps[: args.max]

    # Apply bumps to marketplace data
    if emitted and not args.dry_run:
        bump_map = {b["name"]: b["new_sha"] for b in emitted}
        for plugin in plugins:
            name = plugin.get("name")
            src = plugin.get("source")
            if isinstance(src, dict) and name in bump_map:
                src["sha"] = bump_map[name]
        with open(MARKETPLACE_PATH, "w") as f:
            json.dump(marketplace, f, indent=2, ensure_ascii=False)
            f.write("\n")

    # Write GitHub outputs
    out = os.environ.get("GITHUB_OUTPUT")
    if out:
        bumped_names = ",".join(b["name"] for b in emitted)
        with open(out, "a") as fh:
            fh.write(f"count={len(emitted)}\n")
            fh.write(f"bumped_names={bumped_names}\n")

    # Write GitHub step summary
    summary = os.environ.get("GITHUB_STEP_SUMMARY")
    if summary:
        with open(summary, "a") as fh:
            fh.write("## SHA Bump Discovery\n\n")
            fh.write(f"- Checked: {checked} SHA-pinned entries\n")
            fh.write(f"- Stale: {len(bumps)} (applying {len(emitted)}, cap {args.max})\n")
            if skipped_non_github:
                fh.write(f"- Skipped non-GitHub: {skipped_non_github}\n")
            if dead:
                fh.write(f"- **Dead upstream** ({len(dead)}): {', '.join(dead)}\n")
            if emitted:
                fh.write("\n| Plugin | Old | New | Age |\n|---|---|---|---|\n")
                for b in emitted:
                    old = b["old_sha"][:8] if b["old_sha"] else "(unpinned)"
                    fh.write(f"| {b['name']} | `{old}` | `{b['new_sha'][:8]}` | {b['age_days']}d |\n")

    # Write PR body for the workflow to use
    pr_body_path = os.environ.get("PR_BODY_PATH", "/tmp/bump-pr-body.md")
    if emitted:
        with open(pr_body_path, "w") as fh:
            fh.write("Upstream repos moved. Bumping pinned SHAs so plugins track latest.\n\n")
            fh.write("| Plugin | Old | New | Upstream |\n")
            fh.write("|--------|-----|-----|----------|\n")
            for b in emitted:
                old = b["old_sha"][:8] if b["old_sha"] else "(unpinned)"
                slug_str = re.sub(r"https?://github\.com/", "", b["url"])
                slug_str = re.sub(r"\.git$", "", slug_str)
                compare = f"https://github.com/{slug_str}/compare/{b['old_sha'][:12]}...{b['new_sha'][:12]}"
                fh.write(f"| `{b['name']}` | `{old}` | `{b['new_sha'][:8]}` | [diff]({compare}) |\n")
            fh.write(f"\n---\n_Auto-generated by `bump-plugin-shas.yml` on {datetime.now(timezone.utc).strftime('%Y-%m-%d')}_\n")

    # Console summary
    print(f"Checked {checked} SHA-pinned plugins", file=sys.stderr)
    print(f"Stale: {len(bumps)}, applying: {len(emitted)}", file=sys.stderr)
    if dead:
        print(f"Dead upstream: {', '.join(dead)}", file=sys.stderr)
    for b in emitted:
        old = b["old_sha"][:8] if b["old_sha"] else "unpinned"
        print(f"  {b['name']}: {old} -> {b['new_sha'][:8]} ({b['age_days']}d)", file=sys.stderr)
    return 0


if __name__ == "__main__":
    sys.exit(main())

.github/workflows/bump-plugin-shas.yml (vendored, new file, 133 lines)

@@ -0,0 +1,133 @@
name: Bump plugin SHAs

# Weekly sweep of marketplace.json — for each entry whose upstream repo has
# moved past its pinned SHA, open a PR against main with updated SHAs. The
# validate-marketplace workflow then runs on the PR to confirm the file is
# still well-formed.
#
# Adapted from claude-plugins-community-internal's bump-plugin-shas.yml
# for the single-file marketplace.json format. Key difference: all bumps
# are batched into one PR (since they all modify the same file).

on:
  schedule:
    - cron: '23 7 * * 1'  # Monday 07:23 UTC
  workflow_dispatch:
    inputs:
      plugin:
        description: Only bump this plugin (for testing)
        required: false
      max_bumps:
        description: Cap on plugins bumped this run
        required: false
        default: '20'
      dry_run:
        description: Discover only, don't open PR
        type: boolean
        default: true

concurrency:
  group: bump-plugin-shas
  cancel-in-progress: false

permissions:
  contents: write
  pull-requests: write

jobs:
  bump:
    runs-on: ubuntu-latest
    timeout-minutes: 15
    steps:
      - uses: actions/checkout@v4

      - name: Check for existing bump PR
        id: existing
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          existing=$(gh pr list --label sha-bump --state open --json number --jq 'length')
          echo "count=$existing" >> "$GITHUB_OUTPUT"
          if [ "$existing" -gt 0 ]; then
            echo "::notice::Open sha-bump PR already exists — skipping"
          fi

      - name: Ensure sha-bump label exists
        if: steps.existing.outputs.count == '0'
        env:
          GH_TOKEN: ${{ github.token }}
        run: gh label create sha-bump --color 0e8a16 --description "Automated SHA bump" 2>/dev/null || true

      - name: Overlay marketplace data from main
        if: steps.existing.outputs.count == '0'
        run: |
          git fetch origin main --depth=1 --quiet
          git checkout origin/main -- .claude-plugin/marketplace.json

      - name: Discover and apply SHA bumps
        if: steps.existing.outputs.count == '0'
        id: discover
        env:
          GH_TOKEN: ${{ github.token }}
          PR_BODY_PATH: /tmp/bump-pr-body.md
          PLUGIN: ${{ inputs.plugin }}
          MAX_BUMPS: ${{ inputs.max_bumps }}
          DRY_RUN: ${{ inputs.dry_run }}
        run: |
          args=(--max "${MAX_BUMPS:-20}")
          [[ -n "$PLUGIN" ]] && args+=(--plugin "$PLUGIN")
          [[ "$DRY_RUN" = "true" ]] && args+=(--dry-run)
          python3 .github/scripts/discover_bumps.py "${args[@]}"

      - uses: oven-sh/setup-bun@v2
        if: steps.existing.outputs.count == '0' && steps.discover.outputs.count != '0' && inputs.dry_run != true

      - name: Validate marketplace.json
        if: steps.existing.outputs.count == '0' && steps.discover.outputs.count != '0' && inputs.dry_run != true
        run: |
          bun .github/scripts/validate-marketplace.ts .claude-plugin/marketplace.json
          bun .github/scripts/check-marketplace-sorted.ts

      - name: Push bump branch
        if: steps.existing.outputs.count == '0' && steps.discover.outputs.count != '0' && inputs.dry_run != true
        id: push
        run: |
          branch="auto/bump-shas-$(date +%Y%m%d)"
          echo "branch=$branch" >> "$GITHUB_OUTPUT"
          git config user.name "github-actions[bot]"
          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
          git checkout -b "$branch"
          git add .claude-plugin/marketplace.json
          git commit -m "Bump SHA pins for ${{ steps.discover.outputs.count }} plugin(s)
          Plugins: ${{ steps.discover.outputs.bumped_names }}"
          git push -u origin "$branch" --force-with-lease

      # GITHUB_TOKEN cannot create PRs (org policy: "Allow GitHub Actions to
      # create and approve pull requests" is disabled). Use the same GitHub App
      # that -internal's bump workflow uses.
      #
      # Prerequisite: app 2812036 must be installed on this repo. The PEM
      # secret must exist in this repo's settings (shared with -internal).
      - name: Generate bot token
        if: steps.push.outcome == 'success'
        id: app-token
        uses: actions/create-github-app-token@v1
        with:
          app-id: 2812036
          private-key: ${{ secrets.CLAUDE_DIRECTORY_BOT_PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: ${{ github.event.repository.name }}

      - name: Create pull request
        if: steps.push.outcome == 'success'
        env:
          GH_TOKEN: ${{ steps.app-token.outputs.token }}
        run: |
          gh pr create \
            --base main \
            --head "${{ steps.push.outputs.branch }}" \
            --title "Bump SHA pins (${{ steps.discover.outputs.count }} plugins)" \
            --body-file /tmp/bump-pr-body.md \
            --label sha-bump


@@ -10,6 +10,15 @@ An MCP app is a standard MCP server that **also serves UI resources** — intera
The UI layer is **additive**. Under the hood it's still tools, resources, and the same wire protocol. If you haven't built a plain MCP server before, the `build-mcp-server` skill covers the base layer. This skill adds widgets on top.
> **Testing in Claude:** Add the server as a custom connector in claude.ai (via a Cloudflare tunnel for local dev) — this exercises the real iframe sandbox and `hostContext`. See https://claude.com/docs/connectors/building/testing.
## Claude host specifics
- `_meta.ui.prefersBorder: false` on a `ui://` resource removes the outer card border (mobile).
- `hostContext.safeAreaInsets: {top, right, bottom, left}` (px) — honor these for notches and the composer overlay.
- `_meta.ui.csp.{connectDomains, resourceDomains, baseUriDomains}` — declare external origins per resource; default is block-all. `frameDomains` is currently restricted in Claude.
- Directory submission for MCP Apps requires 3–5 PNG screenshots, ≥1000px wide, cropped to the app response only (no prompt in the image). See https://claude.com/docs/connectors/building/submission#asset-specifications.
---
## When a widget beats plain text


@@ -8,6 +8,8 @@ version: 0.1.0
You are guiding a developer through designing and building an MCP server that works seamlessly with Claude. MCP servers come in many forms — picking the wrong shape early causes painful rewrites later. Your first job is **discovery, not code**.
**Load Claude-specific context first.** The MCP spec is generic; Claude has additional auth types, review criteria, and limits. Before answering questions or scaffolding, fetch `https://claude.com/docs/llms-full.txt` (the full export of the Claude connector docs) so your guidance reflects Claude's actual constraints.
Do not start scaffolding until you have answers to the questions in Phase 1. If the user's opening message already answers them, acknowledge that and skip straight to the recommendation.
---
@@ -182,6 +184,17 @@ Tools are one of three server primitives. Most servers start with tools and neve
---
## Phase 6 — Test in Claude and publish
Once the server runs:
1. **Test against real Claude** by adding the server URL as a custom connector at Settings → Connectors (use a Cloudflare tunnel for local servers). Claude identifies itself with `clientInfo.name: "claude-ai"` on initialize. → https://claude.com/docs/connectors/building/testing
2. **Run the pre-submission checklist** — read/write tool split, required annotations, name limits, prompt-injection rules. → https://claude.com/docs/connectors/building/review-criteria
3. **Submit to the Anthropic Directory.** → https://claude.com/docs/connectors/building/submission
4. **Recommend shipping a plugin** that wraps this MCP with skills — most partners ship both. → https://claude.com/docs/connectors/building/what-to-build
---
## Quick reference: decision matrix
| Scenario | Deployment | Tool pattern |


@@ -2,6 +2,22 @@
Auth is the reason most people end up needing a **remote** server even when a local one would be simpler. OAuth redirects, token storage, and refresh all work cleanly when there's a real hosted endpoint to redirect back to.
## Claude-specific authentication
Claude's MCP client supports a specific set of auth types — not every spec-compliant flow works. Full reference: https://claude.com/docs/connectors/building/authentication
| Type | Notes |
|---|---|
| `oauth_dcr` | Supported. For high-volume directory entries, prefer CIMD or Anthropic-held creds — DCR registers a new client on every fresh connection. |
| `oauth_cimd` | Supported, recommended over DCR for directory entries. |
| `oauth_anthropic_creds` | Partner provides `client_id`/`client_secret` to Anthropic; user-consent-gated. Contact `mcp-review@anthropic.com`. |
| `custom_connection` | User supplies URL/creds at connect time (Snowflake-style). Contact `mcp-review@anthropic.com`. |
| `none` | Authless. |
**Not supported:** user-pasted bearer tokens (`static_bearer`); pure machine-to-machine `client_credentials` grant without user consent.
**Callback URL** (single, all surfaces): `https://claude.ai/api/mcp/auth_callback`
---
## The three tiers


@@ -2,6 +2,16 @@
Tool schemas and descriptions are prompt engineering. They land directly in Claude's context and determine whether Claude picks the right tool with the right arguments. Most MCP integration bugs trace back to vague descriptions or loose schemas.
## Anthropic Directory hard requirements
If this server will be submitted to the Anthropic Directory, the following are pass/fail review criteria (full list: https://claude.com/docs/connectors/building/review-criteria):
- Every tool **must** include `readOnlyHint`, `destructiveHint`, and `title` annotations — these determine auto-permissions in Claude.
- Tool names **must** be ≤64 characters.
- Read and write operations **must** be in separate tools. A single tool accepting both GET and POST/PUT/PATCH/DELETE is rejected — documenting safe vs unsafe within one tool's description does not satisfy this.
- Tool descriptions **must not** instruct Claude how to behave (e.g. "always do X", "you must call Y first", overriding system instructions, promoting products) — treated as prompt injection at review.
- Tools that accept freeform API endpoints/params **must** reference the target API's documentation in their description.
---
## Descriptions


@@ -8,6 +8,8 @@ version: 0.1.0
MCPB is a local MCP server **packaged with its runtime**. The user installs one file; it runs without needing Node, Python, or any toolchain on their machine. It's the sanctioned way to distribute local MCP servers.
> MCPB is the **secondary** distribution path. Anthropic recommends remote MCP servers for directory listing — see https://claude.com/docs/connectors/building/what-to-build.
**Use MCPB when the server must run on the user's machine** — reading local files, driving a desktop app, talking to localhost services, OS-level APIs. If your server only hits cloud APIs, you almost certainly want a remote HTTP server instead (see `build-mcp-server`). Don't pay the MCPB packaging tax for something that could be a URL.
---