Updated Feb 10, 2026

Connecting to Neon

In L5, you built transactions that keep your Budget Tracker data consistent. Transfers succeed completely or fail completely. Your database is safe from corruption.

But there's a problem: Your database dies when you restart Python. SQLite in-memory mode means all your expense data vanishes every time the program stops. You can't share your Budget Tracker with friends. You can't access it from your phone.

The solution: Move from local SQLite to a cloud database. Neon gives you a PostgreSQL database that runs 24/7, auto-scales with traffic, and costs nothing for learning.

What is Neon?

Neon is a serverless PostgreSQL database. Serverless means you don't manage servers. Neon handles scaling, backups, and availability automatically.

| Feature | SQLite (what you have now) | Neon (what you're getting) |
| --- | --- | --- |
| Location | Local file or memory | Cloud (always available) |
| Users | Single process only | Multiple users simultaneously |
| Persistence | Dies on restart | Always running |
| Backups | None (manual only) | Automatic |
| Scaling | Fixed | Auto-scales with traffic |
| Cost | $0 | $0 (free tier) |

Neon also offers features you don't need yet but will appreciate later:

  • Auto-pause: Scales to zero when idle (no cost when not in use)
  • Database branching: Create copies of your database like Git branches
  • Read replicas: Scale read-heavy workloads

For learning, Neon's free tier is plenty: up to 100 projects, 0.5 GB of storage per project, and shared compute.

Step 1: Create Your Neon Account

  1. Go to neon.tech
  2. Sign up (GitHub login is fastest)
  3. Create a new project:
    • Name: budget-tracker (or anything you remember)
    • Region: Choose closest to you (lower latency)
    • Database: Keep default neondb
  4. Wait about 30 seconds for provisioning

Your project is ready when you see the dashboard.

Step 2: Get Your Connection String

In the Neon dashboard:

  1. Click your project
  2. Go to the Connection Details section
  3. Find the connection string selector
  4. Select: Python and psycopg2
  5. Copy the connection string

It looks like this:

postgresql+psycopg2://alice:secretpass123@ep-cool-breeze-123456.us-east-2.aws.neon.tech/neondb?sslmode=require

Let's break down what each part means:

  • postgresql+psycopg2: driver (psycopg2 = PostgreSQL driver for Python)
  • alice: username
  • secretpass123: password (keep this secret!)
  • ep-cool-breeze-123456.us-east-2.aws.neon.tech: host (your Neon endpoint)
  • neondb: database name
  • sslmode=require: SSL required (Neon enforces encryption)
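If you'd rather have code split the URL apart, SQLAlchemy can parse it for you. Here's a minimal sketch using the example string above (not real credentials); make_url is SQLAlchemy's standard helper for this:

from sqlalchemy.engine import make_url

# Example string from above; substitute your own, and never hard-code real credentials
url = make_url(
    "postgresql+psycopg2://alice:secretpass123"
    "@ep-cool-breeze-123456.us-east-2.aws.neon.tech/neondb?sslmode=require"
)

print(url.drivername)  # postgresql+psycopg2
print(url.username)    # alice
print(url.host)        # ep-cool-breeze-123456.us-east-2.aws.neon.tech
print(url.database)    # neondb
print(url.query)       # contains {'sslmode': 'require'}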

Important: This string contains your password. Treat the entire string as a secret.

Step 3: Store Credentials Securely

Never put passwords directly in code. Use environment variables.

Install python-dotenv (if not already installed):

uv add python-dotenv psycopg2-binary

Or with pip:

pip install python-dotenv psycopg2-binary

Create .env file in your project root:

DATABASE_URL=postgresql+psycopg2://alice:secretpass123@ep-cool-breeze-123456.us-east-2.aws.neon.tech/neondb?sslmode=require

Load in Python:

import os
from dotenv import load_dotenv

load_dotenv()  # Read .env file

database_url = os.getenv("DATABASE_URL")
if not database_url:
    raise ValueError("DATABASE_URL not set in .env file")

print("Database URL loaded successfully")

Output:

Database URL loaded successfully

Add .env to .gitignore (critical for security):

echo ".env" >> .gitignore

Verify it's ignored:

git status
# .env should NOT appear in "Untracked files"

Why this matters: If you commit .env to GitHub, anyone can access your database. Bots scan public repos for exposed credentials. Don't be that person.

Step 4: Configure Connection Pooling

Neon limits concurrent connections. Without pooling, every query opens a new connection (slow, hits limits fast). With pooling, you reuse connections (fast, respects limits).

Here's the production-ready engine configuration:

from sqlalchemy import create_engine
from sqlalchemy.pool import QueuePool
import os
from dotenv import load_dotenv

load_dotenv()

DATABASE_URL = os.getenv("DATABASE_URL")
if not DATABASE_URL:
    raise ValueError("DATABASE_URL not set in .env file")

engine = create_engine(
    DATABASE_URL,
    poolclass=QueuePool,
    pool_size=5,         # Keep 5 connections ready
    max_overflow=10,     # Allow 10 more during traffic spikes
    pool_pre_ping=True,  # Test connection before using it
    pool_recycle=3600,   # Recreate connections after 1 hour
)

What each parameter does:

| Parameter | Value | Purpose |
| --- | --- | --- |
| pool_size | 5 | Warm connections always ready (no wait time) |
| max_overflow | 10 | Extra connections during high traffic (temporary) |
| pool_pre_ping | True | Verify connection works before using it |
| pool_recycle | 3600 | Refresh stale connections (Neon pauses idle connections) |

Total maximum connections: pool_size + max_overflow = 5 + 10 = 15

Why pool_pre_ping=True is critical: Neon auto-pauses idle databases after 5 minutes. When you reconnect, stale connections fail. pool_pre_ping tests each connection before use and automatically replaces dead ones.
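If you want to see the pool doing its job, print its status before and after running a query. A minimal sketch, assuming the engine configured above (the exact status string can differ between SQLAlchemy versions):

from sqlalchemy import text

print(engine.pool.status())  # e.g. "Pool size: 5  Connections in pool: 0 ..."

with engine.connect() as conn:
    conn.execute(text("SELECT 1"))  # checks a connection out of the pool

print(engine.pool.status())  # the connection is back in the pool, ready for reuse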

Step 5: Test Your Connection

Before deploying your models, verify the connection works:

from sqlalchemy import text

try:
    with engine.connect() as conn:
        result = conn.execute(text("SELECT 1"))
        print("Connection successful!")
except Exception as e:
    print(f"Connection failed: {e}")

Output (success):

Connection successful!

Output (common failures):

Connection failed: could not connect to server: Connection timed out
Connection failed: FATAL: password authentication failed for user "alice"
Connection failed: No module named 'psycopg2'

If you see errors, jump to the Troubleshooting section below.
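Once SELECT 1 succeeds, you can also ask the server what it is running, which confirms you're talking to Neon's PostgreSQL rather than a leftover local database. A small optional check, assuming the same engine:

from sqlalchemy import text

with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())
    # e.g. "PostgreSQL 16.x on x86_64-pc-linux-gnu, compiled by gcc ..."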

Step 6: Deploy Your Models

Once connected, create your tables in Neon:

from sqlalchemy.orm import declarative_base

Base = declarative_base()

# ... your User, Category, Expense models from previous lessons ...

# Create all tables in Neon
Base.metadata.create_all(engine)
print("Tables created in Neon!")

Output:

Tables created in Neon!

Verify in Neon dashboard:

  1. Go to your project
  2. Click SQL Editor
  3. Run:
SELECT table_name FROM information_schema.tables WHERE table_schema = 'public';

Output:

table_name
----------
users
categories
expenses

Your Budget Tracker now runs on a real cloud database.
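If you prefer to verify from Python instead of the dashboard, SQLAlchemy's inspector lists the tables it finds on the connected database. A minimal sketch, assuming the same engine:

from sqlalchemy import inspect

inspector = inspect(engine)
print(inspector.get_table_names())  # expect ['users', 'categories', 'expenses'] (order may vary)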

Troubleshooting Common Errors

Error: could not connect to server

Cause: Connection string is wrong, or network blocks the connection.

Fix:

  1. Go to Neon dashboard
  2. Copy connection string again (ensure you selected psycopg2)
  3. Update your .env file
  4. Try from a different network (some corporate firewalls block database ports)

Error: No module named 'psycopg2'

Cause: PostgreSQL driver not installed.

Fix:

uv add psycopg2-binary
# OR
pip install psycopg2-binary

Error: FATAL: password authentication failed

Cause: Password in connection string is wrong.

Fix:

  1. Go to Neon dashboard
  2. Click "Reset password" for your database user
  3. Copy the new connection string
  4. Update .env

Error: server closed the connection unexpectedly

Cause: Connection went stale (Neon paused your database).

Fix: Ensure pool_pre_ping=True in your engine configuration. This detects dead connections and replaces them automatically.

Error: remaining connection slots are reserved

Cause: Too many open connections (hit Neon's limit).

Fix:

  1. Reduce pool_size (try 3 instead of 5)
  2. Ensure you're closing sessions properly (with Session(engine) as session:, shown in the sketch after this list)
  3. Check for connection leaks (sessions opened but never closed)
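For points 2 and 3, the context-manager pattern below returns the connection to the pool automatically, even when an exception is raised. A minimal sketch using the Session class from earlier lessons:

from sqlalchemy import text
from sqlalchemy.orm import Session

# Good: the connection goes back to the pool when the block exits, even on errors
with Session(engine) as session:
    session.execute(text("SELECT 1"))

# Avoid: Session(engine) without a guaranteed close() keeps the connection checked out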

Working With AI on Connection Issues

You know what error message you're seeing. AI helps diagnose and fix.

Your request: "I'm getting this error connecting to Neon: psycopg2.OperationalError: could not connect to server: Connection timed out. My .env looks correct. What should I check?"

AI suggests: "Let's diagnose step by step:

  1. Can you ping the host? Run: ping ep-cool-breeze-123456.us-east-2.aws.neon.tech
  2. Is your firewall blocking port 5432?
  3. Try the connection from a different network (phone hotspot)
  4. Check if the Neon project is paused (go to dashboard, project should show 'Active')"

Your refinement: "Ping works. Dashboard shows Active. But I'm on corporate VPN."

AI adapts: "Corporate VPNs often block non-HTTP traffic. Try:

  1. Disconnect from VPN temporarily
  2. Use Neon's connection pooling endpoint (port 443 instead of 5432)
  3. Ask IT to whitelist Neon's IP ranges"

What emerged: A systematic troubleshooting approach that isolated the corporate VPN as the issue. Neither of you assumed the answer upfront; you narrowed it down through iteration.

What Happens Next

Your Budget Tracker database is now running on Neon PostgreSQL. It persists forever. It scales automatically. It handles multiple concurrent users. It's production-ready from day one.

But so far, you've been building isolated pieces: models in L2, CRUD operations in L3, relationships in L4, transactions in L5, Neon setup in L6. Next, you'll learn when to combine these SQL tools with bash verification for production reliability, then put everything together in the capstone.

| Lesson | What You Learn | What You Build |
| --- | --- | --- |
| L6 (now) | Connect to Neon, configure pooling | Production-ready database |
| L7 | Combine SQL + bash hybrid patterns | Tool choice framework |
| L8 | Integrate all L0-L7 patterns | Complete Budget Tracker app |

L7 teaches the hybrid verification pattern — when to combine SQL with bash for self-checking data pipelines. Then the capstone lesson shows how all pieces work together into one coherent application.

Try With AI

Prompt 1: Parse Connection String

What you're learning: Understanding the components of database URLs.

Given this connection string:
postgresql+psycopg2://alice:Pass123@ep-main-789.us-west-1.aws.neon.tech/mybudget?sslmode=require

Answer these questions:
1. What's the username?
2. What's the password? (hint: don't share this in real projects)
3. What's the database host?
4. What's the database name?
5. Why is sslmode=require at the end?
6. What does postgresql+psycopg2 mean?

For each answer, explain why that component matters.

After AI explains, verify: Can you identify each part in YOUR connection string?

Prompt 2: Deploy Budget Tracker to Neon

What you're learning: Real cloud deployment workflow.

Help me complete these steps to deploy my Budget Tracker to Neon:

1. I've created a Neon account and project
2. I have my connection string
3. I need to:
- Create .env file with DATABASE_URL
- Add .env to .gitignore
- Install psycopg2-binary
- Update my engine with connection pooling
- Test connection with SELECT 1
- Run Base.metadata.create_all(engine)
- Verify tables exist in Neon dashboard

Give me the exact commands and code for each step.
After each step, tell me how to verify it worked.

After completing, verify: Are your three tables (users, categories, expenses) visible in Neon's SQL Editor?

Prompt 3: Document Deployment Patterns

What you're learning: Creating reusable deployment skills.

Add to my /database-deployment skill:

## Neon PostgreSQL Setup

Include:
1. Connection string anatomy (driver, user, pass, host, db, ssl)
2. Environment variable pattern (.env + dotenv + .gitignore)
3. Connection pooling configuration (with parameter explanations)
4. Troubleshooting guide (4 common errors and fixes)

Format as markdown. Make it reusable for ANY Neon project.
Use [YOUR_CONNECTION_STRING] as placeholder, not my actual credentials.

After AI responds, check: Could someone on your team use this skill to set up a new Neon database?

Security reminder: Never commit .env files. Never share connection strings in chat logs, screenshots, or code reviews. Rotate passwords if you accidentally expose them.

Checkpoint

Before moving to L7 (Hybrid Patterns):

  • Neon account created with budget-tracker project
  • Connection string copied from Neon dashboard
  • .env file created with DATABASE_URL
  • .env added to .gitignore (verified with git status)
  • psycopg2-binary installed
  • Engine configured with connection pooling (5 parameters)
  • Connection tested (SELECT 1 succeeds)
  • Tables created in Neon (Base.metadata.create_all(engine))
  • Tables verified in Neon SQL Editor
  • Documented patterns in /database-deployment skill

Your Budget Tracker now persists data forever in the cloud. Ready for the capstone.