MongoDB Lens
MongoDB Lens is a local Model Context Protocol (MCP) server providing full-featured access to MongoDB databases, using natural language via LLMs to perform queries, run aggregations, optimize performance, and more.
Contents
- Quick Start
- Features
- Installation
- Configuration
- Client Setup
- Data Protection
- Tutorial
- Disclaimer
- Support
Quick Start
- Install MongoDB Lens
- Configure MongoDB Lens
- Set up your MCP Client (e.g. Claude Desktop)
- Explore your MongoDB databases with natural language queries
Features
Tools
- `aggregate-data`: Execute aggregation pipelines
- `analyze-query-patterns`: Analyze queries and suggest optimizations
- `analyze-schema`: Automatically infer collection schemas
- `bulk-operations`: Perform multiple operations efficiently (requires confirmation for destructive operations)
- `collation-query`: Find documents with language-specific collation rules
- `compare-schemas`: Compare schemas between two collections
- `count-documents`: Count documents matching specified criteria
- `create-collection`: Create new collections with custom options
- `create-database`: Create a new database (without switching to it)
- `create-database-and-switch`: Create a new database and switch to it
- `create-index`: Create new indexes for performance optimization
- `create-timeseries`: Create time series collections for temporal data
- `create-user`: Create new database users with specific roles
- `current-database`: Show the current database context
- `delete-document`: Delete documents matching specified criteria (requires confirmation)
- `distinct-values`: Extract unique values for any field
- `drop-collection`: Remove collections from the database (requires confirmation)
- `drop-database`: Drop a database (requires confirmation)
- `drop-index`: Remove indexes from collections (requires confirmation)
- `drop-user`: Remove database users (requires confirmation)
- `explain-query`: Analyze query execution plans
- `export-data`: Export query results in JSON or CSV format
- `find-documents`: Run queries with filters, projections, and sorting
- `generate-schema-validator`: Generate JSON Schema validators
- `geo-query`: Perform geospatial queries with various operators
- `get-stats`: Retrieve database or collection statistics
- `gridfs-operation`: Manage large files with GridFS buckets
- `list-collections`: Explore collections in the current database
- `list-databases`: View all accessible databases
- `map-reduce`: Run MapReduce operations for complex data processing
- `modify-document`: Insert or update specific documents
- `rename-collection`: Rename existing collections (requires confirmation when dropping targets)
- `shard-status`: View sharding configuration for databases and collections
- `text-search`: Perform full-text search across text-indexed fields
- `transaction`: Execute multiple operations in a single ACID transaction
- `use-database`: Switch to a specific database context
- `validate-collection`: Check for data inconsistencies
- `watch-changes`: Monitor real-time changes to collections
Resources
- `collection-indexes`: Index information for a collection
- `collection-schema`: Schema information for a collection
- `collection-stats`: Performance statistics for a collection
- `collection-validation`: Validation rules for a collection
- `collections`: List of collections in the current database
- `database-triggers`: Database change streams and event triggers configuration
- `database-users`: Database users and roles in the current database
- `databases`: List of all accessible databases
- `performance-metrics`: Real-time performance metrics and profiling data
- `replica-status`: Replica set status and configuration
- `server-status`: Server status information
- `stored-functions`: Stored JavaScript functions in the current database
Prompts
- `aggregation-builder`: Step-by-step creation of aggregation pipelines
- `backup-strategy`: Customized backup and recovery recommendations
- `data-modeling`: Expert advice on MongoDB schema design for specific use cases
- `database-health-check`: Comprehensive database health assessment and recommendations
- `index-recommendation`: Get personalized index suggestions based on query patterns
- `inspector-guide`: Get help using MongoDB Lens with MCP Inspector
- `migration-guide`: Step-by-step MongoDB version migration plans
- `mongo-shell`: Generate MongoDB shell commands with explanations
- `multi-tenant-design`: Design MongoDB multi-tenant database architecture
- `query-builder`: Interactive guidance for constructing MongoDB queries
- `query-optimizer`: Optimization recommendations for slow queries
- `schema-analysis`: Detailed collection schema analysis with recommendations
- `schema-versioning`: Manage schema evolution in MongoDB applications
- `security-audit`: Database security analysis and improvement recommendations
- `sql-to-mongodb`: Convert SQL queries to MongoDB aggregation pipelines
Other Features
Other Features: Overview
MongoDB Lens includes several additional features:
- Sanitized Inputs: Security enhancements for query processing
- Configuration File: Custom configuration via `~/.mongodb-lens.json`
- Connection Resilience: Automatic reconnection with exponential backoff
- JSONRPC Error Handling: Comprehensive error handling with proper error codes
- Memory Management: Automatic memory monitoring and cleanup for large operations
- Smart Caching: Enhanced caching for schemas, collection lists, and server status
Other Features: New Database Metadata
When MongoDB Lens creates a new database via the `create-database` or `create-database-and-switch` tools, it automatically adds a `metadata` collection containing a single document. This serves several purposes:
- MongoDB only persists databases containing at least one collection
- Records database creation details (timestamp, tool version, user)
- Captures environment information for diagnostics
Example metadata document
{
"_id" : ObjectId("67d5284463788ec38aecee14"),
"created" : {
"timestamp" : ISODate("2025-03-15T07:12:04.705Z"),
"tool" : "MongoDB Lens v5.0.7",
"user" : "anonymous"
},
"mongodb" : {
"version" : "3.6.23",
"connectionInfo" : {
"host" : "unknown",
"readPreference" : "primary"
}
},
"database" : {
"name" : "example_database",
"description" : "Created via MongoDB Lens"
},
"system" : {
"hostname" : "unknown",
"platform" : "darwin",
"nodeVersion" : "v22.14.0"
},
"lens" : {
"version" : "5.0.7",
"startTimestamp" : ISODate("2025-03-15T07:10:06.084Z")
}
}
You can safely remove this collection once you've added your own collections to the new database.
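For example, a one-liner like the following could remove the placeholder via mongosh (a minimal sketch assuming a local instance and the illustrative `example_database` name from the document above):
```console
# Drop the auto-created metadata collection once your own collections exist
mongosh "mongodb://localhost:27017/example_database" --eval "db.metadata.drop()"
```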
Installation
MongoDB Lens can be installed and run in several ways:
Installation: NPX
[!NOTE] NPX requires Node.js installed and running on your system (suggestion: use Volta).
The easiest way to run MongoDB Lens is using `npx`:
# Ensure Node.js is installed
node --version # Ideally >= v22.x but MongoDB Lens is >= v18.x compatible
# Using default connection string mongodb://localhost:27017
npx -y mongodb-lens
# Using custom connection string
npx -y mongodb-lens mongodb://your-connection-string
[!TIP] If you encounter permissions errors with `npx`, try running `npx clear-npx-cache` prior to running `npx -y mongodb-lens` (this clears the cache and re-downloads the package).
Installation: Docker Hub
[!NOTE] Docker Hub requires Docker installed and running on your system.
Run MongoDB Lens via Docker Hub:
# Using default connection string mongodb://localhost:27017
docker run --rm -i --network=host furey/mongodb-lens
# Using custom connection string
docker run --rm -i --network=host furey/mongodb-lens mongodb://your-connection-string
# Using "--pull" to keep the Docker image up-to-date
docker run --rm -i --network=host --pull=always furey/mongodb-lens
Installation: Node.js from Source
[!NOTE] Node.js from source requires Node.js installed and running on your system (suggestion: use Volta).
- Clone the MongoDB Lens repository:
git clone https://github.com/furey/mongodb-lens.git
- Navigate to the cloned repository directory:
cd /path/to/mongodb-lens
- Ensure Node.js is installed:
node --version # Ideally >= v22.x but MongoDB Lens is >= v18.x compatible
- Install Node.js dependencies:
npm ci
- Start the server:
# Using default connection string mongodb://localhost:27017
node mongodb-lens.js

# Using custom connection string
node mongodb-lens.js mongodb://your-connection-string
Installation: Docker from Source
[!NOTE] Docker from source requires Docker installed and running on your system.
- Clone the MongoDB Lens repository:
git clone https://github.com/furey/mongodb-lens.git
- Navigate to the cloned repository directory:
cd /path/to/mongodb-lens
- Build the Docker image:
docker build -t mongodb-lens .
- Run the container:
# Using default connection string mongodb://localhost:27017
docker run --rm -i --network=host mongodb-lens

# Using custom connection string
docker run --rm -i --network=host mongodb-lens mongodb://your-connection-string
Installation Verification
To verify the installation, paste the following JSON-RPC message into the server's stdin:
{"method":"resources/read","params":{"uri":"mongodb://databases"},"jsonrpc":"2.0","id":1}
The server should respond with a list of databases in your MongoDB instance, for example:
{"result":{"contents":[{"uri":"mongodb://databases","text":"Databases (12):\n- admin (180.00 KB)\n- config (108.00 KB)\n- local (40.00 KB)\n- sample_airbnb (51.88 MB)\n- sample_analytics (9.46 MB)\n- sample_geospatial (980.00 KB)\n- sample_guides (40.00 KB)\n- sample_mflix (108.90 MB)\n- sample_restaurants (7.73 MB)\n- sample_supplies (968.00 KB)\n- sample_training (40.85 MB)\n- sample_weatherdata (2.69 MB)"}]},"jsonrpc":"2.0","id":1}
MongoDB Lens is now installed and ready to accept MCP requests.
Configuration
Configuration: MongoDB Connection String
The server accepts a MongoDB connection string as its only argument.
Example NPX usage:
npx -y mongodb-lens mongodb://your-connection-string
MongoDB connection strings have the following format:
mongodb://[username:password@]host[:port][/database][?options]
Example connection strings:
- Local connection: `mongodb://localhost:27017`
- Connection to `mydatabase` with credentials from the `admin` database: `mongodb://username:password@hostname:27017/mydatabase?authSource=admin`
- Connection to `mydatabase` with various other options: `mongodb://hostname:27017/mydatabase?retryWrites=true&w=majority`
If no connection string is provided, the server will attempt to connect to the default local connection (`mongodb://localhost:27017`).
Configuration: Verbose Logging
With verbose logging enabled, the server will output additional information to the console.
To enable verbose logging, set environment variable `VERBOSE_LOGGING` to `true`.
Example NPX usage:
VERBOSE_LOGGING=true npx -y mongodb-lens mongodb://your-connection-string
Example Docker Hub usage:
docker run --rm -i --network=host -e VERBOSE_LOGGING='true' furey/mongodb-lens mongodb://your-connection-string
Configuration: Config File
MongoDB Lens can also be configured via JSON config file: `~/.mongodb-lens.json`
Alternatively, set environment variable `CONFIG_PATH` to the path of your custom config file.
Example NPX usage:
CONFIG_PATH='/path/to/config.json' npx -y mongodb-lens
Example Docker Hub usage:
docker run --rm -i --network=host -v /path/to/config.json:/root/.mongodb-lens.json furey/mongodb-lens
Example configuration file contents:
{
"mongoUri": "mongodb://username:password@hostname:27017/mydatabase?authSource=admin",
"connectionOptions": {
"maxPoolSize": 20,
"connectTimeoutMS": 30000
}
}
Client Setup
Client Setup: Claude Desktop
To use MongoDB Lens with Claude Desktop:
- Install Claude Desktop
- Open `claude_desktop_config.json` (create it if it doesn't exist):
  - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Add the MongoDB Lens server configuration as per the configuration options below
- Restart Claude Desktop
- Start a conversation with Claude about your MongoDB data
Claude Desktop Configuration Options
- Option 1: NPX (Recommended)
- Option 2: Docker Hub Image
- Option 3: Local Node.js Installation
- Option 4: Local Docker Image
For each option:
- Replace `mongodb://your-connection-string` with your MongoDB connection string, or omit it to use the default `mongodb://localhost:27017`.
- Set `VERBOSE_LOGGING` to `true` or `false`.
- To use a custom config file, see Configuration: Config File and adapt the option accordingly.
Option 1: NPX (Recommended)
{
"mcpServers": {
"mongodb-lens": {
"command": "/path/to/npx",
"args": [
"-y",
"mongodb-lens",
"mongodb://your-connection-string"
],
"env": {
"VERBOSE_LOGGING": "[true|false]"
}
}
}
}
Option 2: Docker Hub Image
{
"mcpServers": {
"mongodb-lens": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"--network=host",
"--pull=always",
"-e",
"VERBOSE_LOGGING=[true|false]",
"furey/mongodb-lens",
"mongodb://your-connection-string"
]
}
}
}
Option 3: Local Node.js Installation
{
"mcpServers": {
"mongodb-lens": {
"command": "/path/to/node",
"args": [
"/path/to/mongodb-lens.js",
"mongodb://your-connection-string"
],
"env": {
"VERBOSE_LOGGING": "[true|false]"
}
}
}
}
Option 4: Local Docker Image
{
"mcpServers": {
"mongodb-lens": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"--network=host",
"-e",
"VERBOSE_LOGGING=[true|false]",
"mongodb-lens",
"mongodb://your-connection-string"
]
}
}
}
Client Setup: MCP Inspector
MCP Inspector is a tool designed for testing and debugging MCP servers.
[!NOTE] MCP Inspector starts a proxy server on port 3000 and a web client on port 5173.
Example NPX usage:
- Run MCP Inspector:
# Using default connection string mongodb://localhost:27017
npx -y @modelcontextprotocol/inspector npx -y mongodb-lens

# Using custom connection string
npx -y @modelcontextprotocol/inspector npx -y mongodb-lens mongodb://your-connection-string

# Using verbose logging
npx -y @modelcontextprotocol/inspector -e VERBOSE_LOGGING=true npx -y mongodb-lens

# Using custom ports
SERVER_PORT=1234 CLIENT_PORT=5678 npx -y @modelcontextprotocol/inspector npx -y mongodb-lens
- Open MCP Inspector: http://localhost:5173
MCP Inspector should support the full range of MongoDB Lens capabilities, including autocompletion for collection names and query fields.
For more, see: MCP Inspector
Client Setup: Other MCP Clients
MongoDB Lens should be usable with any MCP-compatible client.
For more, see: MCP Documentation: Example Clients
Data Protection
To protect your data while using MongoDB Lens, consider the following:
Data Protection: Read-Only User Accounts
When connecting MongoDB Lens to your database, the permissions granted to the user in your connection string dictate what actions can be performed. For exploration and analysis, a read-only user can prevent unintended writes or deletes, ensuring MongoDB Lens can query data but not alter it.
To set this up, create a user with the 'read' role scoped to the database(s) you're targeting. In MongoDB shell, you'd run something like:
use admin
db.createUser({
user: 'readonly',
pwd: 'eXaMpLePaSsWoRd',
roles: [{ role: 'read', db: 'mydatabase' }]
})
Then, plug those credentials into your MongoDB Lens connection string (e.g. `mongodb://readonly:eXaMpLePaSsWoRd@localhost:27017/mydatabase`). This restricts MongoDB Lens to read-only operations, safeguarding your data during development or testing. It's a simple yet effective way to enforce security boundaries, especially when you're poking around schemas or running ad-hoc queries.
Data Protection: Working with Database Backups
To keep your production data unmodified while leveraging MongoDB Lens for analysis, it's suggested to use a backup copy hosted on a separate MongoDB instance. This setup isolates your live environment, letting you experiment with queries or aggregations without risking accidental corruption.
Start by generating a backup with `mongodump`. Next, spin up a fresh MongoDB instance (e.g. on a different port like `27018`) and restore the backup there using `mongorestore`. Once it's running, point MongoDB Lens to the backup instance's connection string (e.g. `mongodb://localhost:27018/mydatabase`).
This approach gives you a sandbox to test complex operations, such as pipeline-heavy aggregations or schema tweaks, without touching your production data. It's a practical choice when you need to dig into your dataset safely, especially in scenarios where live modifications aren't an option.
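A minimal sketch of that workflow could look like the following (the container name, archive filename, and database name are illustrative; adjust URIs and ports to your environment):
```console
# Dump the production database
mongodump --uri="mongodb://localhost:27017/mydatabase" --archive=backup.archive

# Start a throwaway MongoDB instance on a different port
docker run --name mongodb-backup -d -p 27018:27017 mongo:6

# Restore the dump into the backup instance
mongorestore --port=27018 --archive=backup.archive

# Point MongoDB Lens at the backup instance
npx -y mongodb-lens mongodb://localhost:27018/mydatabase
```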
Data Protection: Confirmation for Destructive Operations
MongoDB Lens implements a token-based confirmation system for potentially destructive operations. This system requires a two-step process for executing commands that could result in data loss:
- First command invocation: Returns a 4-digit confirmation token that expires after 5 minutes
- Second command invocation: Executes the operation if provided with the valid token
Operations that require confirmation include:
- `bulk-operations`: When including delete operations
- `delete-document`: Delete one or multiple documents
- `drop-collection`: Delete a collection and all its documents
- `drop-database`: Permanently delete a database
- `drop-index`: Remove an index (potential performance impact)
- `drop-user`: Remove a database user
- `rename-collection`: When the target collection exists and will be dropped
This protection mechanism prevents accidental data loss from typos and unintended commands. It's a safety net ensuring you're aware of the consequences before proceeding with potentially harmful actions (see Tutorial: 5. Working With Confirmation Protection for a walkthrough).
Bypassing Confirmation for Destructive Operations
You might want to bypass the token confirmation system. Set the environment variable `DISABLE_DESTRUCTIVE_OPERATION_TOKENS` to `true` to execute destructive operations immediately without confirmation:
# Using NPX
DISABLE_DESTRUCTIVE_OPERATION_TOKENS=true npx -y mongodb-lens
# Using Docker
docker run --rm -i --network=host -e DISABLE_DESTRUCTIVE_OPERATION_TOKENS='true' furey/mongodb-lens
[!WARNING] Disabling confirmation tokens removes an important safety mechanism. It's strongly recommended to only use this option in controlled environments where data loss is acceptable, such as development or testing. Disable at your own risk.
Tutorial
The following tutorial guides you through setting up a MongoDB container with sample data, then using MongoDB Lens to interact with it through natural language queries:
- Start Sample Data Container
- Import Sample Data
- Connect MongoDB Lens
- Example Queries
- Working With Confirmation Protection
Tutorial: 1. Start Sample Data Container
[!NOTE] This tutorial assumes you have Docker installed and running on your system.
[!IMPORTANT] If Docker is already running a container on port 27017, stop it before proceeding.
1. Initialise the sample data container:
```console
docker run --name mongodb-sampledata -d -p 27017:27017 mongo:6
```
2. Verify the container is running without issue:
docker ps | grep mongodb-sampledata
Tutorial: 2. Import Sample Data
MongoDB provides several sample datasets which we'll use to explore MongoDB Lens.
- Download the sample datasets:
curl -LO https://atlas-education.s3.amazonaws.com/sampledata.archive
- Copy the sample datasets into your sample data container:
docker cp sampledata.archive mongodb-sampledata:/tmp/
- Import the sample datasets into MongoDB:
docker exec -it mongodb-sampledata mongorestore --archive=/tmp/sampledata.archive
This will import several databases:
- `sample_airbnb`: Airbnb listings and reviews
- `sample_analytics`: Customer and account data
- `sample_geospatial`: Geographic data
- `sample_mflix`: Movie data
- `sample_restaurants`: Restaurant data
- `sample_supplies`: Supply chain data
- `sample_training`: Training data for various applications
- `sample_weatherdata`: Weather measurements
Tutorial: 3. Connect MongoDB Lens
Install MongoDB Lens as per the Quick Start instructions.
Set your MCP Client to connect to MongoDB Lens via: mongodb://localhost:27017
[!TIP] Omitting the connection string from your MCP Client configuration will default the connection string to `mongodb://localhost:27017`.
Example Claude Desktop configuration:
{
"mcpServers": {
"mongodb-lens": {
"command": "/path/to/npx",
"args": [
"-y",
"mongodb-lens"
]
}
}
}
Tutorial: 4. Example Queries
With your MCP Client running and connected to MongoDB Lens, try the following example queries demonstrating MongoDB Lens's tools, resources, and prompts:
- Example Queries: Basic Database Operations
- Example Queries: Movie Data Analysis
- Example Queries: Airbnb Data Exploration
- Example Queries: Weather Data Operations
- Example Queries: Geospatial Operations
- Example Queries: Time Series & Change Streams
- Example Queries: Bulk Operations & Data Modeling
- Example Queries: Administrative Operations
- Example Queries: Schema Management & Analysis
- Example Queries: Advanced Features
Example Queries: Basic Database Operations
- "List all available databases" ➥ Uses `list-databases` tool
- "What's the current database I'm connected to?" ➥ Uses `current-database` tool
- "Switch to the sample_mflix database" ➥ Uses `use-database` tool
- "What collections are available in this database?" ➥ Uses `list-collections` tool
- "Get statistics for the sample_mflix database" ➥ Uses `get-stats` tool with database target
- "Create a new collection called temp_collection" ➥ Uses `create-collection` tool
- "Drop the collection temp_collection" ➥ Uses `drop-collection` tool with confirmation flow
- "Create a new database called other_database, but stay in the current database" ➥ Uses `create-database` tool
- "Create a new database called switch_database and switch to it" ➥ Uses `create-database-and-switch` tool
- "Drop the other_database" ➥ Uses `drop-database` tool to get a confirmation token
- "Drop other_database with token 1234" ➥ Uses `drop-database` tool with the token parameter
- "Delete all documents where status equals 'inactive' from the users collection" ➥ Uses `delete-document` tool with confirmation flow
Example Queries: Movie Data Analysis
- "Switch back to sample_mflix db and count the movies collection"
â„ Uses
count-documents
tool - "Find the top 5 movies by IMDB rating with a runtime over 120 minutes"
â„ Uses
find-documents
tool with sort and filter - "What's the schema of the movies collection?"
â„ Uses
analyze-schema
tool - "Find distinct countries where movies were produced"
â„ Uses
distinct-values
tool - "Create an index on the title field in the movies collection"
â„ Uses
create-index
tool - "Why is my query for movies with over 1000 votes slow? Help me optimize it"
â„ Uses
query-optimizer
prompt - "Run an explain on the query {year: 1995}"
â„ Uses
explain-query
tool - "Build an aggregation pipeline to show the count of movies by decade and genre"
â„ Uses
aggregation-builder
prompt - "Execute this aggregation pipeline: [{$group: {_id: {$floor: {$divide: ['$year', 10]}}, count: {$sum: 1}}}]"
â„ Uses
aggregate-data
tool - "Update all movies from 1994 to add a 'classic' field set to true"
â„ Uses
modify-document
tool with update operation
Example Queries: Airbnb Data Exploration
- "Switch to sample_airbnb database" ➥ Uses `use-database` tool
- "Get collection statistics for the listingsAndReviews collection" ➥ Uses `get-stats` tool with collection target
- "What's the validation rules for the listingsAndReviews collection?" ➥ Uses `collection-validation` resource
- "Show me the indexes on the listingsAndReviews collection" ➥ Uses `collection-indexes` resource
- "Find listings with more than 5 bedrooms in Manhattan, limited to 10 results" ➥ Uses `find-documents` tool
- "Get distinct property types in the listings" ➥ Uses `distinct-values` tool
- "Help me create a query filter to find superhosts with pool amenities" ➥ Uses `query-builder` prompt
- "Export the top 20 highest-rated listings in Brooklyn as CSV with name, price, and rating" ➥ Uses `export-data` tool
- "Is my schema optimized for querying by neighborhood? Analyze and give recommendations" ➥ Uses `schema-analysis` prompt
- "Rename the reviews collection to guest_reviews" ➥ Uses `rename-collection` tool
Example Queries: Weather Data Operations
- "Switch to sample_weatherdata database" ➥ Uses `use-database` tool
- "What's in the schema of the data collection?" ➥ Uses `collection-schema` resource
- "Find the highest recorded temperatures with a callLetters of 'SHIP'" ➥ Uses `find-documents` tool
- "Validate the data collection for inconsistencies" ➥ Uses `validate-collection` tool
- "Insert a new weather record for today" ➥ Uses `modify-document` tool with insert operation
- "Create a new collection called weather_summary" ➥ Uses `create-collection` tool
- "Create index recommendation for queries that filter by callLetters and sort by date" ➥ Uses `index-recommendation` prompt
- "Show me how to write a MapReduce operation to get average temperatures by day" ➥ Uses `mongo-shell` prompt
- "Run this MapReduce to calculate average pressure by location" ➥ Uses `map-reduce` tool
- "Delete all weather readings below -50 degrees" ➥ Uses `delete-document` tool
Example Queries: Geospatial Operations
- "Switch to sample_geospatial database"
â„ Uses
use-database
tool - "Find all shipwrecks within 5km of the coast of Florida"
â„ Uses
geo-query
tool with near operator - "Show me restaurants that fall within the downtown Manhattan polygon"
â„ Uses
geo-query
tool with geoWithin operator - "Which bike routes intersect with Central Park?"
â„ Uses
geo-query
tool with geoIntersects operator - "Create a geospatial index on the location field of the neighborhoods collection"
â„ Uses
create-index
tool with 2dsphere index type - "Analyze the schema of the shipwrecks collection to understand its geospatial data structure"
â„ Uses
analyze-schema
tool
Example Queries: Time Series & Change Streams
- "Create a new time series collection for sensor readings with 'timestamp' as the time field"
â„ Uses
create-timeseries
tool - "Watch for changes in the orders collection for the next 30 seconds"
â„ Uses
watch-changes
tool - "Monitor all insert operations on the users collection for 15 seconds"
â„ Uses
watch-changes
tool with specific operations - "Create a time series collection for IoT device data with hourly granularity"
â„ Uses
create-timeseries
tool with granularity option - "Create a time series collection that automatically deletes data older than 30 days"
â„ Uses
create-timeseries
tool with expireAfterSeconds option
Example Queries: Bulk Operations & Data Modeling
- "Switch to sample_training database" ➥ Uses `use-database` tool
- "Execute a bulk operation to update multiple post documents to add 'edited' flags" ➥ Uses `bulk-operations` tool
- "How should I model a social media application in MongoDB?" ➥ Uses `data-modeling` prompt
- "Perform a bulk insertion of new product records in the supplies database" ➥ Uses `bulk-operations` tool
- "Show me how to use MongoDB Lens with the MCP Inspector" ➥ Uses `inspector-guide` prompt
- "What's the optimal data model for a multi-tenant SaaS application with heavy analytical queries?" ➥ Uses `data-modeling` prompt
Example Queries: Administrative Operations
- "Switch to the admin database"
â„ Uses
use-database
tool - "Show me the server status"
â„ Uses
server-status
resource - "Display the replica set configuration"
â„ Uses
replica-status
resource - "List all users in the database"
â„ Uses
database-users
resource - "Get any stored JavaScript functions"
â„ Uses
stored-functions
resource - "Perform a security audit on my MongoDB deployment"
â„ Uses
security-audit
prompt - "What's a good backup strategy for my MongoDB instance?"
â„ Uses
backup-strategy
prompt - "How would I migrate from MongoDB 4.4 to 6.0?"
â„ Uses
migration-guide
prompt
Example Queries: Schema Management & Analysis
- "Compare schemas between the users and customers collections" ➥ Uses new `compare-schemas` tool to identify differences
- "Generate a JSON Schema validator for the profiles collection with moderate strictness" ➥ Uses new `generate-schema-validator` tool
- "Analyze query patterns for the orders collection" ➥ Uses new `analyze-query-patterns` tool
- "What fields are missing in the new customers collection compared to the old one?" ➥ Uses `compare-schemas` to analyze migration gaps
- "Are my indexes being used effectively for my queries?" ➥ Uses `analyze-query-patterns` to identify optimization opportunities
Example Queries: Advanced Features
- "Switch to sample_mflix database" ➥ Uses `use-database` tool
- "Search for movies containing the phrase 'space odyssey' using text search" ➥ Uses `text-search` tool
- "Find users named 'müller' using German collation rules" ➥ Uses `collation-query` tool
- "List all files in the images GridFS bucket" ➥ Uses `gridfs-operation` tool with list operation
- "Get detailed information about the 'profile.jpg' file in GridFS" ➥ Uses `gridfs-operation` tool with info operation
- "Delete the 'old_backup.zip' file from the files GridFS bucket" ➥ Uses `gridfs-operation` tool with delete operation
- "Check the sharding status of the sample_analytics database" ➥ Uses `shard-status` tool with database target
- "View the sharding distribution for the customers collection" ➥ Uses `shard-status` tool with collection target
- "Execute a transaction that transfers $100 from account A to account B" ➥ Uses `transaction` tool
- "Get real-time performance metrics for my MongoDB server" ➥ Uses `performance-metrics` resource
- "Show me the current event triggers in my database" ➥ Uses `database-triggers` resource
- "Convert this SQL query to MongoDB: SELECT * FROM users WHERE age > 30 ORDER BY name" ➥ Uses `sql-to-mongodb` prompt
- "Perform a comprehensive health check on my database" ➥ Uses `database-health-check` prompt
- "Help me design a multi-tenant architecture for my SaaS application" ➥ Uses `multi-tenant-design` prompt
- "I need to add user address fields to my schema. How should I version and migrate?" ➥ Uses `schema-versioning` prompt
Tutorial: 5. Working With Confirmation Protection
MongoDB Lens includes a safety mechanism for potentially destructive operations. Here's how it works in practice:
- Request to drop a collection:
"Drop the collection named test_collection"
- MongoDB Lens responds with a warning and confirmation token:
⚠️ DESTRUCTIVE OPERATION WARNING ⚠️
You've requested to drop the collection 'test_collection'.
This operation is irreversible and will permanently delete all data in this collection.
To confirm, you must type the 4-digit confirmation code EXACTLY as shown below:
Confirmation code: 9876
This code will expire in 5 minutes for security purposes.
- Confirm the operation by including the confirmation token:
"Drop test_collection with token 1234"
- MongoDB Lens executes the operation:
Collection 'test_collection' has been permanently deleted.
This two-step process prevents accidental data loss by requiring explicit confirmation.
For development environments, this can be bypassed by setting the `DISABLE_DESTRUCTIVE_OPERATION_TOKENS` environment variable to `true`.
Disclaimer
MongoDB Lens:
- is licensed under the MIT License.
- is not affiliated with or endorsed by MongoDB, Inc.
- is written with the assistance of AI and may contain errors.
- is intended for educational and experimental purposes only.
- is provided as-is with no warranty; please use at your own risk.
Support
If you've found MongoDB Lens helpful, consider supporting my work through:
Buy Me a Coffee | GitHub Sponsorship
Contributions help me continue developing and improving this tool, allowing me to dedicate more time to adding new features and ensuring it remains a valuable resource for the community.