
Atlas Data Lake

On this page

  • About Atlas Data Lake
  • Sample Uses
  • Data Lake Access
  • Privilege Actions
  • Authentication Options
  • Atlas Data Lake Regions
  • Billing

About Atlas Data Lake

MongoDB Atlas Data Lake allows you to natively query, transform, and move data across AWS S3 and MongoDB Atlas clusters. You can query your richly structured data stored in JSON, BSON, CSV, TSV, Avro, ORC, and Parquet formats using the mongo shell, MongoDB Compass, or any MongoDB driver.
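
For example, a driver connection to a Data Lake looks like any other MongoDB connection. The sketch below uses PyMongo with a placeholder connection string, database, and collection; your Data Lake's actual connection string is shown in the Atlas UI.

    from pymongo import MongoClient

    # Placeholder Data Lake connection string; copy the real one from the
    # Atlas UI ("Connect" on your Data Lake). Credentials are SCRAM placeholders.
    client = MongoClient(
        "mongodb://myUser:myPassword@datalake0-example.a.query.mongodb.net/"
        "?tls=true&authSource=admin"
    )

    db = client["sampleLake"]     # virtual database from the storage configuration
    orders = db["orders"]         # virtual collection backed by S3 and/or Atlas data

    # Ordinary MongoDB query syntax works against Data Lake virtual collections.
    for doc in orders.find({"status": "shipped"}).limit(5):
        print(doc)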

Sample Uses

You can use Atlas Data Lake to:

  • Convert richly structured MongoDB data into columnar Parquet or CSV files (see the sketch after this list).
  • Query across multiple Atlas clusters to get a holistic view of your data.
  • Materialize aggregations from MongoDB or S3 data.
  • Automatically import data from your S3 bucket into an Atlas cluster.
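
For example, materializing an aggregation back to S3 as Parquet is just an aggregation pipeline that ends in a $out stage. The sketch below is illustrative: the connection string, bucket, and pipeline are placeholders, and the exact $out-to-S3 options are described in the Data Lake $out reference.

    from pymongo import MongoClient

    client = MongoClient(
        "mongodb://myUser:myPassword@datalake0-example.a.query.mongodb.net/?tls=true"
    )
    orders = client["sampleLake"]["orders"]

    pipeline = [
        # Summarize order totals per customer.
        {"$group": {"_id": "$customerId", "total": {"$sum": "$amount"}}},
        # Write the result to an S3 bucket as columnar Parquet files.
        {"$out": {
            "s3": {
                "bucket": "my-analytics-bucket",   # bucket the Data Lake can write to
                "region": "us-east-1",
                "filename": "order-totals/",
                "format": {"name": "parquet"},
            }
        }},
    ]

    orders.aggregate(pipeline)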

Data Lake Access

When you create a Data Lake, you grant Atlas either read-only or read and write access to S3 buckets in your AWS account. To access your Atlas clusters, Atlas uses your existing Role-Based Access Controls. You can view and edit the generated data storage configuration that maps data from your S3 buckets and Atlas clusters to virtual databases and collections.
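
As an illustration, a storage configuration is a JSON document that declares data stores and maps them to virtual databases and collections. The sketch below uses placeholder store, bucket, cluster, and database names, and the field names follow the Data Lake storage configuration format; see the storage configuration reference for the full schema.

    # Hypothetical Data Lake storage configuration (all names are placeholders).
    storage_config = {
        "stores": [
            # An S3 data store, readable (or writable) by the Data Lake.
            {"name": "s3store", "provider": "s3",
             "region": "us-east-1", "bucket": "my-data-bucket"},
            # An Atlas cluster data store.
            {"name": "atlasStore", "provider": "atlas",
             "clusterName": "Cluster0", "projectId": "<project-id>"},
        ],
        "databases": [
            {
                "name": "sales",                   # virtual database
                "collections": [
                    {
                        "name": "orders",          # virtual collection
                        "dataSources": [
                            # S3 objects under a prefix and an Atlas collection
                            # are exposed together as one virtual collection.
                            {"storeName": "s3store", "path": "/orders/*"},
                            {"storeName": "atlasStore",
                             "database": "sales", "collection": "orders"},
                        ],
                    }
                ],
            }
        ],
    }

You can apply a configuration like this with the storageSetConfig command described under Privilege Actions below.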

A database user must have one of the following roles to query an Atlas Data Lake:

Privilege Actions

Privilege actions define the operations that you can perform on your Data Lake. You can grant the following Atlas Data Lake privilege actions when you create or modify custom roles in the Atlas User Interface, or through the actions.action request body parameter when you create or update a custom role with the Atlas API:
  • sqlGetSchema: Retrieve the schema stored for a collection or view using the sqlGetSchema command.
  • sqlSetSchema: Set or delete the schema for a collection or view using the sqlSetSchema command.
  • viewAllHistory: Retrieve details about the queries that were run in the past 24 hours using $queryHistory.
  • outToS3: Write data from one or more of the supported data stores to your S3 bucket using $out.
  • storageGetConfig: Retrieve your Data Lake storage configuration using the storageGetConfig command.
  • storageSetConfig: Set or update your Data Lake storage configuration using the storageSetConfig command.
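
A rough sketch of invoking these commands from PyMongo follows. The command names are those listed above, but the exact parameter shapes (for example, passing a collection name to sqlGetSchema, or running $queryHistory as a database-level aggregation) are assumptions; check each command's reference before relying on them.

    from pymongo import MongoClient

    client = MongoClient(
        "mongodb://myUser:myPassword@datalake0-example.a.query.mongodb.net/?tls=true"
    )
    db = client["sampleLake"]

    # storageGetConfig: read back the current storage configuration.
    current_config = db.command("storageGetConfig")

    # storageSetConfig: replace the storage configuration (e.g. the
    # storage_config dict sketched earlier on this page).
    # db.command({"storageSetConfig": storage_config})

    # sqlGetSchema: retrieve the schema stored for a virtual collection or view
    # (assumes the command takes the collection name as its value).
    schema = db.command({"sqlGetSchema": "orders"})

    # viewAllHistory: list queries run in the past 24 hours via $queryHistory
    # (assumes a collectionless, database-level aggregation stage).
    for entry in db.aggregate([{"$queryHistory": {}}]):
        print(entry)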

Authentication Options

Data Lake supports SCRAM-SHA and X.509 authentication. It doesn't support LDAP.
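
With PyMongo, for instance, the authentication mechanism is selected through standard driver options. This is a minimal sketch; hostnames, credentials, and certificate paths are placeholders.

    from pymongo import MongoClient

    # SCRAM (username/password) authentication.
    client = MongoClient(
        "mongodb://myUser:myPassword@datalake0-example.a.query.mongodb.net/"
        "?tls=true&authSource=admin&authMechanism=SCRAM-SHA-256"
    )

    # X.509 (client certificate) authentication.
    # client = MongoClient(
    #     "mongodb://datalake0-example.a.query.mongodb.net/?authMechanism=MONGODB-X509",
    #     tls=True,
    #     tlsCertificateKeyFile="/path/to/client.pem",
    # )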

Atlas Data Lake Regions

Note

To prevent excessive charges on your bill, create your Atlas Data Lake in the same AWS region as your S3 data source.

Atlas Data Lake routes your Data Lake requests through one of the following regions (the corresponding AWS region is shown in parentheses):

  • Northern Virginia, North America (us-east-1)
  • Oregon, North America (us-west-2)
  • Ireland, Europe (eu-west-1)
  • London, Europe (eu-west-2)
  • Frankfurt, Europe (eu-central-1)
  • Mumbai, Asia (ap-south-1)
  • Sydney, Australia (ap-southeast-2)
Note

You will incur charges when running Atlas Data Lake queries. For more information, see Billing below.

Billing

You incur Atlas Data Lake costs for the following items:

  • Storage on the cloud object storage
  • Data processed by Data Lake
  • Data returned by Data Lake

Atlas charges for the total number of bytes that Data Lake processes from your AWS S3 buckets, rounded up to the nearest megabyte. Atlas charges $5.00 per TB of processed data, with a per-query minimum of 10 MB processed (that is, $0.00005).
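
As a rough worked example under those published rates (assuming decimal units, 1 TB = 1,000,000 MB):

    # Estimate the data-processing charge for one Data Lake query, using the
    # rates quoted above: $5.00 per TB processed, 10 MB ($0.00005) minimum.
    PRICE_PER_TB = 5.00
    MIN_MB_PER_QUERY = 10

    def processing_cost(megabytes_scanned: float) -> float:
        billed_mb = max(megabytes_scanned, MIN_MB_PER_QUERY)
        return billed_mb / 1_000_000 * PRICE_PER_TB   # 1 TB = 1,000,000 MB

    print(processing_cost(1))      # below the minimum -> $0.00005
    print(processing_cost(2_000))  # 2 GB scanned      -> $0.01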

You can use partitioning strategies and compression in AWS S3 to reduce the amount of processed data.

Atlas charges for the total number of bytes returned by Data Lake. This total is the sum of the following data transfers:

  • The number of bytes transferred between Data Lake service nodes
  • The number of bytes transferred from Data Lake to the client

Returned data is billed as outlined in the Data Transfer Fees section of the Atlas pricing page. The cost of data transfer depends on the Cloud Service Provider charges for same-region, region-to-region, or region-to-internet data transfer.
