Language Server Protocol and collection of language tools for Postgres
21 Mar 2025 · 8 minute read
Fauna recently announced they will sunset their product by the end of May 2025, prompting engineering teams to find reliable alternatives quickly. Supabase offers a natural migration path for Fauna users, providing a robust, scalable, and open-source alternative built on Postgres.
Fauna was known for its serverless database model, offering easy scalability, flexible data modeling, and integrated GraphQL APIs. Teams depending on Fauna must now evaluate alternatives carefully, considering impacts on data modeling, querying, and backend logic.
Migrating away from Fauna requires adjustments in query logic, schema definition, and overall application architecture.
Supabase is an open-source Postgres development platform that offers a full Postgres database, authentication, file storage, Edge Functions, Realtime subscriptions, and auto-generated APIs.
Migrating across data structures can be difficult, and normalizing large sets of unstructured or semi-structured data can take time. Given the May 30th Fauna Sunset deadline, we recommend a two-phase approach to ensure your application stays online.
Phase 1: your data is safely moved to Supabase before the Fauna sunset date, and your applications continue to function properly.
Phase 2: with your data secured and your applications still functional, you can safely and confidently complete the transition to Supabase.
Phase 1 of the Fauna to Supabase migration focuses on exporting your data from Fauna, importing it into Supabase as a JSONB data type, and rewriting your data APIs to use the Supabase SDK.
Fauna allows exporting collections through their admin dashboard or CLI. Use the Fauna CLI to export your collections to Amazon S3 in JSON format:
fauna export create s3 \
--database <database_name> \
--collection <collection_name> \
--bucket <s3_bucket_name> \
--path <s3_bucket_path> \
--format simple
Fauna has also provided instructions using the Fauna Query Language.
Create a table in Supabase with a JSONB column to store raw Fauna documents:
create table fauna_users_raw (
id uuid primary key default gen_random_uuid(),
data jsonb not null
);
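If you plan to query keys inside the JSONB column before normalizing, it may help to add a GIN index (a sketch; the index name is illustrative):

```sql
-- Speeds up JSONB containment and key-existence queries on the raw documents
create index fauna_users_raw_data_idx on fauna_users_raw using gin (data);
```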
Then, ingest the exported JSON data into this Supabase table using this custom script:
import { createClient } from '@supabase/supabase-js'
import fs from 'fs'

const supabaseUrl = 'YOUR_SUPABASE_URL'
const supabaseKey = 'YOUR_SUPABASE_API_KEY'
const tableName = 'YOUR_TABLE_NAME'
const jsonFilePath = './filename.json'

const supabase = createClient(supabaseUrl, supabaseKey)

async function loadDocumentsToSupabase() {
  try {
    // Read and parse the exported JSON file
    const rawData = fs.readFileSync(jsonFilePath, 'utf8')
    const dataArray = JSON.parse(rawData).map((data) => ({ data }))

    // Insert each document into the JSONB column
    const { error } = await supabase.from(tableName).insert(dataArray)
    if (error) {
      console.error('Error inserting data:', error)
      return
    }
    console.log(`Successfully inserted ${dataArray.length} records into ${tableName}`)
  } catch (error) {
    console.error('Error in process:', error)
  }
}

loadDocumentsToSupabase()
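For large exports, a single insert call may exceed request size limits. One way to handle this is to insert in batches; the `chunkArray` helper below is a hypothetical sketch, not part of the Supabase SDK:

```javascript
// Split an array into batches of a given size (hypothetical helper)
function chunkArray(items, size) {
  const batches = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// Usage sketch: insert the documents 500 at a time
// for (const batch of chunkArray(dataArray, 500)) {
//   const { error } = await supabase.from(tableName).insert(batch)
//   if (error) throw error
// }

console.log(chunkArray([1, 2, 3, 4, 5], 2).length) // → 3 batches
```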
Once your data has been structured into tables, Supabase automatically generates REST APIs for each table via PostgREST, allowing effortless querying from your application.
Here's a Fauna query example (using FQL) for obtaining data from a users table:
import { Client, fql } from 'fauna'

const client = new Client({ secret: '<your-fauna-secret>' })

const usersQuery = fql`
  users.all() {
    name,
    email
  }
`

client
  .query(usersQuery)
  .then((data) => console.log(data))
  .catch((error) => console.error('Error fetching users:', error))
And here's the equivalent Supabase REST API call:
import { createClient } from '@supabase/supabase-js'

const supabase = createClient('https://<your-project>.supabase.co', '<your-api-key>')

const { data, error } = await supabase.from('fauna_users_raw').select(`
  name: data->data->name,
  email: data->data->email
`)
// the -> operator returns values as jsonb; use ->> to return text

if (error) console.error(error)
else console.log(data)
Once you have brought your collections over to Supabase, you may benefit from data normalization. Because Supabase is built on Postgres, normalized data unlocks performance benefits that a set of collections stored as JSONB cannot match.
Once your data is imported as JSONB, leverage the powerful Postgres JSON functions to incrementally normalize and populate relational tables. In this example, we populate a rudimentary users table (and its nested orders) from the raw documents:
-- Example normalization for users
-- Note: ->> extracts values as text, which is what the target columns expect
INSERT INTO users (name, email)
SELECT
  data->'data'->>'name' AS name,
  data->'data'->>'email' AS email
FROM fauna_users_raw;

-- Example normalization of nested orders
INSERT INTO orders (user_id, product, quantity)
SELECT
  u.id,
  order_data->>'product',
  (order_data->>'quantity')::INTEGER
FROM fauna_users_raw f
JOIN users u ON (f.data->'data'->>'email') = u.email,
LATERAL jsonb_array_elements(f.data->'data'->'orders') AS order_data;
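The INSERT statements above assume the target users and orders tables already exist; a minimal sketch of that schema (column names and types are assumptions based on the queries above) might be:

```sql
create table users (
  id uuid primary key default gen_random_uuid(),
  name text not null,
  email text unique not null
);

create table orders (
  id uuid primary key default gen_random_uuid(),
  user_id uuid references users (id),
  product text not null,
  quantity integer not null
);
```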
Here's the PostgREST query for JSONB data from Phase 1:
import { createClient } from '@supabase/supabase-js'

const supabase = createClient('https://<your-project>.supabase.co', '<your-api-key>')

const { data, error } = await supabase.from('fauna_users_raw').select(`
  name: data->data->name,
  email: data->data->email
`)
// the -> operator returns values as jsonb; use ->> to return text

if (error) console.error(error)
else console.log(data)
And here's the equivalent Supabase REST API call with normalized data:
import { createClient } from '@supabase/supabase-js'
const supabase = createClient('https://<your-project>.supabase.co', '<your-api-key>')
const { data, error } = await supabase.from('users').select('name, email')
if (error) console.error(error)
else console.log(data)
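With normalized tables in place, related data can also be joined directly in SQL; a sketch assuming the users and orders schema from the normalization step:

```sql
-- Fetch each user's orders via the foreign key created during normalization
select u.name, u.email, o.product, o.quantity
from users u
join orders o on o.user_id = u.id;
```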
Once your data is migrated, you can start to use Supabase to its fullest: auto-generated APIs, Row Level Security, Realtime, Edge Functions, and more.
This is no doubt a stressful time as you transition away from Fauna. Supabase is here to help you every step of the way. Reach out to us and we can help you plan your transition and provide assistance.
Supabase is a comprehensive, scalable replacement for Fauna. Built on Postgres, Supabase offers a robust relational model, powerful security features, and predictable pricing. Supabase enables engineering teams to confidently transition away from Fauna thanks to its SQL ecosystem, mature tooling, Row Level Security, strong TypeScript support, and full ACID compliance. Thoughtful planning and methodical execution will ensure a seamless migration and long-term reliability.
19 Dec 2024 · 16 minute read
By Stephen Morgan, Security Engineer
We just concluded our first hack-the-base challenge and my first publicly accessible challenge.
Almost 300 people signed up and we had all sorts of mayhem: challenge account passwords being reset, unprecedented email bounce rates, databases blocking poolers, and leaderboard controversies. There wasn't an issue that could have gone wrong that didn't!
Putting all that aside, congratulations to all those that participated. We had a lot of fun and hope you did, too.
Despite only announcing the challenge several days before launch, we saw some impressive participation from the community.
You will need the following tools installed on your computer to follow along with this walkthrough: a web browser with DevTools, curl, dig, and Burp Suite.
To follow along with this walkthrough practically, you will need to run the challenge app locally. Clone this repo here and follow the README instructions in the dec-24 directory. You will only need to run the web app locally as the Supabase backend project is still up and usable (for the foreseeable future).
Our first flag was intended to get you on the board, nothing too hard here, just enough to get you thinking about where we may have hidden the others.
As the flag's title implies, this one was tucked away in a hidden div within the HTML of the front page.
There are many ways to view the raw HTML or DOM of a website. Making a curl request directly would have returned the raw HTML that you could have grepped; making the request through Burp Suite would have allowed you to inspect the HTML under "Proxy" > "HTTP History"; but perhaps an even easier way is to use the inspect tool of your browser.
Within your browser, right click the home page and select "Inspect". This will bring up the DevTools panel where it will present the current DOM. From here use the search function (CMD + F) and type "flag::"
Did I deliberately include the world's largest SVG on the page to throw you off? I'll never tell.
robots.txt is the filename used for implementing the Robots Exclusion Protocol, a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit.
Coincidentally this file can be a helpful source of information to understand a website better!
For this flag, we included the path of an unreferenced page in this file. Navigating to /robots.txt revealed the path, and the flag could be found at /super-private-do-not-enter.
The Domain Name System (DNS) is the system that startups use to register .io websites. It is also a great source of information. Reverse lookups of IP addresses can tell us what hosting service an app is using, MX records can tell us what email provider they are using, and TXT records can give us some insight into supporting services they are using that aren't necessarily obvious.
For this flag we just needed to query the domain's TXT records like so:
dig TXT miskatonicuniversity.us
This would return the flag in the "Answer Section" of the result.
This one was a little more involved. If you were observing the requests that the web app was making, you would have seen a request to the app's PostgREST endpoint. The URL will look something like https://SUPABASE_PROJECT.supabase.co/rest/v1/TABLE_NAME?QUERY
In DevTools, navigate to "Network" and refresh the home page. This query https://ubssibzsnwtwbbvbhbvb.supabase.co/rest/v1/bulletins?select=*&date=gt.2001-01-01 is scoped to only return bulletins from 2001 or later.
What happens when we remove this scope and request the whole table? We find the flag of course.
Our only option in Chrome is to copy this request as a cURL command and make the modifications in the CLI. If you are a member of the Firefox master race, you can use the "Edit and Resend" feature to make modifications to the request directly in the Developer Console.
We don't need all the headers included in the cURL command so here is the command where we have also deleted the date scope:
curl 'https://ubssibzsnwtwbbvbhbvb.supabase.co/rest/v1/bulletins?select=*' -H 'accept-profile: public' -H 'apikey: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6InVic3NpYnpzbnd0d2JidmJoYnZiIiwicm9sZSI6ImFub24iLCJpYXQiOjE3MzEwMTYxMjksImV4cCI6MjA0NjU5MjEyOX0.hUbR1lqlGnQZlvjOgt4Wzn9iHidJP4iVHyWE76d9FUk' -H 'authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6InVic3NpYnpzbnd0d2JidmJoYnZiIiwicm9sZSI6ImFub24iLCJpYXQiOjE3MzEwMTYxMjksImV4cCI6MjA0NjU5MjEyOX0.hUbR1lqlGnQZlvjOgt4Wzn9iHidJP4iVHyWE76d9FUk'
This will return a JSON array of all the bulletins, the oldest of which includes our flag!
We can do better than this though, so before we move on to the next flag, let's clean up our CLI by putting the Supabase anon key into an environment variable:
export ANON_KEY='eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6InVic3NpYnpzbnd0d2JidmJoYnZiIiwicm9sZSI6ImFub24iLCJpYXQiOjE3MzEwMTYxMjksImV4cCI6MjA0NjU5MjEyOX0.hUbR1lqlGnQZlvjOgt4Wzn9iHidJP4iVHyWE76d9FUk'
Now when we make curl requests we only need to use the variable name like so:
curl 'https://ubssibzsnwtwbbvbhbvb.supabase.co/rest/v1/bulletins?select=*' \
-H 'accept-profile: public' \
-H "apikey: $ANON_KEY" \
-H "authorization: Bearer $ANON_KEY"
If you are feeling fancy, you could use Burp Suite's Proxy Intercept feature to modify the PostgREST request in flight from your browser, delete the scope, and view the flag directly in your browser. I will leave this as an avenue for you to explore yourself.
This one should have been easy, but I was feeling malicious on the day I made it, so I decided to throw in a little extra spice.
If you had explored the "Sign Up" page, you would have noticed the message that only Miskatonic University staff can register: if you didn't have a miskatonicuniversity.us email address, you couldn't register.
Unfortunately being a consultant first and developer second, I chose to only validate this requirement on the client side, meaning we can "Inspect" the DOM of the button and re-enable the button action.
Fill out the form with your account information, check the hCaptcha, find the button element in the DOM, double click the "disabled" property and delete it. Simple!
Or at least it would have been simple if you did this during a quiet period. For reasons known only to my coffee levels that morning, this Supabase project did not have custom SMTP set up, meaning a very strict rate limit of two emails per hour.
If you tried this during a busy period there was an extra step of setting up a script to bot the registration. The hCaptcha field was checked for but didn't need to be valid so this was slightly easier than it sounds.
Unfortunately, more than half of the successful registrations were for people trying to register a miskatonicuniveristy.us email account, so I needed to apply an emergency fix to filter out these attempts so they wouldn't consume the SMTP limit (and Supabase's bounce rate).
I won't go into details about how you could have scripted this; you could have used your language of choice (e.g. Java) or used the Intruder feature of Burp Suite.
Regardless of how hard your registration journey was, once you receive the confirmation email in your inbox the hunt is almost over. Clicking the link in the email will confirm your account with the Supabase project; optionally, you could change the redirect link to the real URL of the app, but your account will be registered regardless.
From here you just needed to login to the app and be presented the flag.
If you had got this far, you may have noticed an error message when querying the project's PostgREST API. We can change the schema we are querying via the "Accept-Profile" header; when changing this to a schema that hasn't been exposed to the API or doesn't exist, we are presented with an error message.
curl 'https://ubssibzsnwtwbbvbhbvb.supabase.co/rest/v1/bulletins?select=*' \
-H 'accept-profile: other_schema' \
-H "apikey: $ANON_KEY" \
-H "authorization: Bearer $ANON_KEY"
"The schema must be one of the following: public, information_schema, storage"
It would appear that the information_schema schema is exposed on this project. Information schemas, you should be aware, are a really bad thing to expose publicly: this is where all of the information about our database can be queried, including tables and roles.
Let's check out the tables table in the information schema, specifically the tables in the public schema. To do this, we modify our cURL command like so, noting the changes to the accept-profile header and the parameters in the URL:
curl 'https://ubssibzsnwtwbbvbhbvb.supabase.co/rest/v1/tables?select=table_name,table_schema&table_schema=eq.public' \
-H 'accept-profile: information_schema' \
-H "apikey: $ANON_KEY" \
-H "authorization: Bearer $ANON_KEY"
This returns a JSON array containing all the table names in the public schema:
[
  { "table_name": "bulletins", "table_schema": "public" },
  { "table_name": "staff", "table_schema": "public" },
  { "table_name": "staff_only", "table_schema": "public" }
]
The bulletins table we already knew about, but staff and staff_only are new to us. The staff_only table will be relevant for the next flag, here we are interested in the staff table.
We are almost there, and now that we understand how to query the schema with our cURL commands, it's as simple as updating our request to query the staff table in the public schema:
curl 'https://ubssibzsnwtwbbvbhbvb.supabase.co/rest/v1/staff' \
-H 'accept-profile: public' \
-H "apikey: $ANON_KEY" \
-H "authorization: Bearer $ANON_KEY"
This will return the full contents of the staff table, which includes our first intermediate flag!
This was the last flag to be found. Despite the clues I had placed throughout the site, no one chose to try to brute force the password of herbert.west@miskatonicuniversity.us. This can be attributed to my failure to anticipate the honesty of our hackers who fairly thought that a dictionary attack would have been considered a Denial of Service technique.
Despite the clue we left on social media, finding this flag was easier said than done. Let's use Burp Suite's Intruder tool for this demonstration. Sorry Burp Suite Community Edition users!
Using Burp's built-in browser (the "Open Browser" button in the "Proxy", "Intercept" menu), navigate to /sign-in, enter herbert.west@miskatonicuniversity.us as the email and anything as the password, and press the Sign in button.
Under Burp Suite's "Proxy" > "HTTP History" panel, find the POST request of this form. Your headers may look different based on your environment.
Right click this and "Send to Intruder". In the Intruder panel, find the password you entered and highlight the whole text. Click the "Add §" button to add the field as your Intruder payload position.
If you are a dinosaur like me, you would load a password file like rockyou.txt.
Supabase Auth has a rate limit for sign in requests based on certain heuristics, so we need to throttle our requests to about one request every 2000 milliseconds. You can do this in the "Resource Pool" section: create a new resource pool with a maximum of 1 concurrent request and a delay of 2000 milliseconds between requests. From here on it is just a waiting game for Intruder to return us a valid result.
Should I have included a password higher up on most wordlists? Yes, yes I should have.
Once you have found the right password (Password123), it's not over yet. Logging in with these credentials will only show you the old client side validation flag.
Remember the staff_only table we discovered in the "open schema" flag previously? Well now is our chance to exploit this. The last thing we need to do is leverage our staff authorisation to query this table.
In your browser, with DevTools open and while logged in as Herbert West, refresh the home page to get the request to bulletins we used in the "scoped bulletins" flag above. In the Network panel, right click it and "Copy as cURL". In the command line, modify the path from /bulletins?select=*&date=gt.2001-01-01 to /staff_only; this will return our second intermediate flag.
While not the last flag to be captured, this was the least captured flag until [this tweet](https://x.com/supabase/status/1866750
06 Dec 2024 · 2 minute read
We are announcing our first-ever Capture the Flag challenge, Hack the Base. Whether you're a seasoned hacker or just starting out, this challenge is for you.
A Capture the Flag (CTF) is a cybersecurity competition where participants solve a series of challenges to earn points. In this case, you'll be tasked with finding flags in our education partner's news site, picking apart how it works and the secrets it may be hiding.
We've designed challenges for both beginner and advanced hackers. Get on the board with the easy challenges, and then move on to the more difficult ones. There will be no hints to start with, but we'll be releasing hints via social media throughout the week to help you along the way.
For detailed rules and to submit your flags, visit our dedicated CTF website. You'll also be able to track your progress on the leaderboard and see how you stack up against other participants. The challenge will kick off on December 8th at 1pm PT and run until December 14th at 1pm PT. Be there or be square!
If you're interested in finding vulnerabilities in Supabase year-round, check out our Bug Bounty Program. We're always looking for talented security researchers to help us improve the security of our platform.
We can't wait to see what you can do! Good luck, and happy hacking!
06 Dec 2024 · 3 minute read
Here are the top 10 launches from the past week. You won't believe #7.
Supabase Cron is a new Postgres module for creating and managing recurring tasks. It is deeply integrated with the rest of the Supabase toolset.
We overhauled the AI Assistant in the Dashboard and gave it a bunch more "abilities": from writing Postgres Functions to creating entire databases.
A feature trifecta: Edge Functions now support WebSockets for things like OpenAI's Realtime API, ephemeral storage for things like zip files, and Background Tasks that continue to run after you've sent a response to your users.
We released v2 of the CLI, adding support for Configuration as Code. You can now use the CLI in your GitHub Actions workflows to keep all of your project configuration in sync.
With advanced disks you can store up to 60 TB of data with 100x improved durability, and provision up to 5x more IOPS than the default disks we offer.
You can now launch new projects from the backups of any of your existing projects. This is particularly helpful if you want to do some data engineering without impacting your production database.
"postgres.new" is now "database.build" and you can still run everything in your own browser. We've added 3 major features: bring-your-own-LLM, Live Share, and the ability to deploy your PGlite database to Supabase.
We're building a new storage engine for Postgres that's much faster than the current storage engine. We've released some benchmarks and made it available on the Platform.
A Postgres-native, durable Message Queue with guaranteed delivery, improving the scalability and resiliency of your applications.
This Launch Week is special because we didn't do it alone. This time, more than 20 other devtools joined us to launch their products and features throughout the week. Some things are just more fun when you do them together.
Want to test your hacking skills? We're running a Capture the Flag event called Hack the Base. It's a free event that anyone can participate in.
We're running a virtual hackathon starting now where you can win prizes for building the best projects using Supabase. We'll be announcing the winners on December 15th.
Effortlessly Clone Data into a New Supabase Project
Store up to 60 TB of data with 100x improved durability and 5x more IOPS
Durable Message Queues with Guaranteed Delivery in Postgres
Release Date: 04 Dec 2024
Today we're releasing Supabase Cron, a new Postgres Module that makes recurring Jobs simple and intuitive inside your database.
It's designed to work seamlessly with the entire Supabase platform. Create recurring Jobs to run SQL snippets and call database functions, Supabase Edge Functions, and even remote webhooks.
Supabase Cron is built on the powerful pg_cron extension by the team at Citus Data.
Cron is a tool for scheduling recurring tasks that run at specified intervals. These periodic tasks are called "Cron Jobs", and common use-cases include routine database maintenance, scheduled reports, and data synchronization.
Supabase Cron stores the scheduling logic within Postgres and integrates with the rest of the Supabase primitives - Dashboard, Edge Functions, and AI Assistant.
You can create Jobs either via the Dashboard or SQL. Within the Dashboard you can define schedules using standard cron syntax, the special pg_cron seconds syntax for sub-minute schedules, or use natural language.
You can choose between four types of Jobs based on what you need to execute:
SQL Snippet: create an inline SQL query or command to run on your database periodically.
Database Function: call a Postgres function, which is useful for multi-step database workflows.
HTTP Request: trigger an external HTTP endpoint or remote webhook.
Edge Function: run a serverless function to execute custom logic.
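As a sketch, an SQL-snippet Job scheduled with the underlying pg_cron extension might look like this (the job name, schedule, and SQL body are illustrative):

```sql
-- Run a cleanup command every day at 03:00
select cron.schedule(
  'nightly-cleanup',   -- job name (illustrative)
  '0 3 * * *',         -- standard cron syntax
  $$ delete from logs where created_at < now() - interval '30 days' $$
);

-- Inspect scheduled jobs, and remove one when no longer needed
select jobid, jobname, schedule from cron.job;
select cron.unschedule('nightly-cleanup');
```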
View the history of all Jobs and their logs in the Dashboard. You can see when a Job started, how long it took, and what the result was.
For deeper analysis, view Jobs in the Logs Explorer.
Getting started is easy: create your first Job from the Dashboard or with SQL.