Type safe SQL query builder and struct mapper for Go


sq (Structured Query)

🎯 πŸ† sq is a code-generated, type safe query builder and struct mapper for Go. πŸ† 🎯
Documentation   β€’   Reference   β€’   Examples

This package provides type safe querying on top of Go's database/sql. It is essentially a glorified string builder, but automates things in all the right places to make working with SQL queries pleasant and boilerplate-free.

  • Avoid magic strings. SQL queries written in Go are full of magic strings: strings specified directly within application code that have an impact on the application's behavior. Specifically, you have to hardcode table or column names over and over into your queries (even ORMs are guilty of this!). Such magic strings are prone to typos and hard to change as your database schema changes. sq generates table structs from your database and ensures that whatever query you write is always reflective of what's actually in your database. more info

  • Better null handling. Handling NULLs is a bit of a pain in the ass in Go. You have to either use pointers (cannot be used in HTML templates) or sql.NullXXX structs (extra layer of indirection). sq scans NULLs as zero values, while still offering you the ability to check if the column was NULL. more info

  • The mapper function is the SELECT clause.

    • database/sql requires you to repeat the list of columns twice in the exact same order, once for SELECT-ing and once for scanning (see the sketch after this list). If you mess the order up, that's an error.
    • Reflection-based mapping (struct tags) has you defining a set of possible column names to map, and then requires you to repeat those column names again in your query. If you mistype a column name in the struct tag, that's an error. If you SELECT a column that's not present in the struct, that's an error.
    • In sq, whatever you SELECT is automatically mapped. This means you just have to write your query and execute it, and if there were no errors, the data is already in your Go variables. No iterating rows, no specifying column scan order, no error checking three times. Write your query, run it, you're done.
    • more info
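
For contrast, here is the double bookkeeping that plain database/sql asks of you: the column list appears once in the query and again, in exactly the same order, in Scan. A minimal sketch using only the standard library:

package example

import "database/sql"

type User struct {
    UserID int
    Name   string
    Email  string
}

func getUsers(db *sql.DB) ([]User, error) {
    // the column list appears here...
    rows, err := db.Query(`SELECT user_id, name, email FROM users`)
    if err != nil {
        return nil, err
    }
    defer rows.Close()
    var users []User
    for rows.Next() {
        var u User
        // ...and again here, in exactly the same order
        if err := rows.Scan(&u.UserID, &u.Name, &u.Email); err != nil {
            return nil, err
        }
        users = append(users, u)
    }
    return users, rows.Err()
}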


Getting started

go get github.com/bokwoon95/go-structured-query

You will also need the dialect-specific code generator:

# Postgres
go get github.com/bokwoon95/go-structured-query/cmd/sqgen-postgres

# MySQL
go get github.com/bokwoon95/go-structured-query/cmd/sqgen-mysql

Generate tables from your database

# for more options, check out --help

# Postgres
sqgen-postgres tables --database 'postgres://name:pass@localhost:5432/dbname?sslmode=disable' --overwrite

# MySQL
sqgen-mysql tables --database 'name:pass@tcp(127.0.0.1:3306)/dbname' --schema dbname --overwrite

For an example of what the generated file looks like, check out postgres/devlab_tables_test.go.
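
As a rough, hand-written sketch of the shape to expect: one struct per table, with an exported ALL_CAPS field per column, typed by the column's SQL type. The field types below are partly assumptions on my part (NumberField and StringField appear elsewhere in this document; TimeField is inferred from the row.Time accessor used later), and the real generator also wires in the schema and table name. The linked file is the authoritative reference.

// Hypothetical sketch only; see postgres/devlab_tables_test.go for real output.
type TABLE_USERS struct {
    USER_ID    sq.NumberField // public.users.user_id
    NAME       sq.StringField // public.users.name
    EMAIL      sq.StringField // public.users.email
    CREATED_AT sq.TimeField   // public.users.created_at
}

// The USERS() constructor used in the examples below returns a value of this kind.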

Importing sq

Each SQL dialect has its own sq package. Import the one for the dialect you are using:

// Postgres
import (
    sq "github.com/bokwoon95/go-structured-query/postgres"
)

// MySQL
import (
    sq "github.com/bokwoon95/go-structured-query/mysql"
)
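
The query builders below hand off execution to a standard database/sql handle at Fetch/Exec time. A minimal setup sketch, assuming Postgres and the lib/pq driver (the driver choice is an assumption; any database/sql driver for your dialect should work):

package main

import (
    "database/sql"
    "log"

    _ "github.com/lib/pq" // assumption: substitute the driver for your dialect
)

func main() {
    db, err := sql.Open("postgres", "postgres://name:pass@localhost:5432/dbname?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
    _ = db // this is the db that the examples below pass to Fetch(db) and Exec(db, ...)
}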

Examples

You just want to see code, right? Here's some.

SELECT

-- SQL
SELECT u.user_id, u.name, u.email, u.created_at
FROM public.users AS u
WHERE u.name = 'Bob';
// Go
u := tables.USERS().As("u") // table is code generated
var user User
var users []User
err := sq.
    From(u).
    Where(u.NAME.EqString("Bob")).
    Selectx(func(row *sq.Row) {
        user.UserID = row.Int(u.USER_ID)
        user.Name = row.String(u.NAME)
        user.Email = row.String(u.EMAIL)
        user.CreatedAt = row.Time(u.CREATED_AT)
    }, func() {
        users = append(users, user)
    }).
    Fetch(db)
if err != nil {
    // handle error
}
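
Here, Selectx takes two functions: a mapper that binds each row's columns into your variables, and an accumulator that runs once per row (appending the freshly mapped user). Fetch(db) then executes the query. This is the pattern described earlier: whatever the mapper reads is exactly what gets SELECT-ed.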

INSERT

-- SQL
INSERT INTO public.users (name, email)
VALUES ('Bob', '[email protected]'), ('Alice', '[email protected]'), ('Eve', '[email protected]');
// Go
u := tables.USERS().As("u") // table is code generated
users := []User{
    {Name: "Bob",   Email: "[email protected]"},
    {Name: "Alice", Email: "[email protected]"},
    {Name: "Eve  ", Email: "[email protected]"},
}
rowsAffected, err := sq.
    InsertInto(u).
    Valuesx(func(col *sq.Column) {
        for _, user := range users {
            col.SetString(u.NAME, user.Name)
            col.SetString(u.EMAIL, user.Email)
        }
    }).
    Exec(db, sq.ErowsAffected)
if err != nil {
    // handle error
}

UPDATE

-- SQL
UPDATE public.users
SET name = 'Bob', password = 'qwertyuiop'
WHERE email = '[email protected]';
// Go
u := tables.USERS().As("u") // table is code generated
user := User{
    Name:     "Bob",
    Email:    "[email protected]",
    Password: "qwertyuiop",
}
rowsAffected, err := sq.
    Update(u).
    Setx(func(col *sq.Column) {
        col.SetString(u.NAME, user.Name)
        col.SetString(u.PASSWORD, user.Password)
    }).
    Where(u.EMAIL.EqString(user.Email)).
    Exec(db, sq.ErowsAffected)
if err != nil {
    // handle error
}

DELETE

-- SQL
DELETE FROM public.users AS u
USING public.user_roles AS ur
JOIN public.user_roles_students AS urs ON urs.user_role_id = ur.user_role_id
WHERE u.user_id = ur.user_id AND urs.team_id = 15;
// Go
u   := tables.USERS().As("u")                 // tables are code generated
ur  := tables.USER_ROLES().As("ur")           // tables are code generated
urs := tables.USER_ROLES_STUDENTS().As("urs") // tables are code generated
rowsAffected, err := sq.
    DeleteFrom(u).
    Using(ur).
    Join(urs, urs.USER_ROLE_ID.Eq(ur.USER_ROLE_ID)).
    Where(
        u.USER_ID.Eq(ur.USER_ID),
        urs.TEAM_ID.EqInt(15),
    ).
    Exec(db, sq.ErowsAffected)
if err != nil {
    // handle error
}

For more information, check out the Basics.

For a list of example queries, check out Query Building.

Project Status

The external API is considered stable. Any changes will only be additive to the API (like support for custom loggers and structured logging). If you have any feature requests or if you find bugs, do open a new issue.

Contributing

See CONTRIBUTING.md

Appendix

Why this exists

I wrote this because I needed a more convenient way to scan database rows into nested structs, some of which exist twice in the same struct due to self-joined tables. That made sqlx's StructScan unsuitable (e.g. it cannot handle type Child struct { Father Person; Mother Person; }). database/sql's way of scanning is really verbose, especially since I had about ~25 fields to scan into, some of which could be null. That's a lot of sql.NullXXX structs needed! Because I had opted not to pollute my domain structs with sql.NullInt64/sql.NullString etc., I had to create a ton of intermediate Null structs just to contain the possible null fields, then transfer their zero values back into the domain struct. There had to be a better way. I just wanted their zero values, since everything in Go accommodates the zero value.

sq is therefore a data mapper first, and query builder second. I try my best to make the query builder as faithful to SQL as possible, but the main reason for its existence was always the struct mapping.

The case for ALL_CAPS

Here are the reasons why ALL_CAPS is used for table and column names over the idiomatic MixedCaps:

  1. jOOQ does it.
  2. It's SQL. It's fine if it doesn't follow Go convention, because it isn't Go.
    • Go requires exported fields to be capitalized.
    • SQL, being case insensitive, generally uses underscores as word delimiters.
    • ALL_CAPS is a blend that satisfies both Go's export rules and SQL's naming conventions.
    • In my opinion, it is also easier to read because table and column names visually stand out from application code.
  3. Avoids clashing with interface methods. For a struct to fit the Table interface, it has to possess the methods GetAlias() and GetName(). This means that no column can be called 'GetAlias' or 'GetName', because it would clash with the interface methods. This is sidestepped by following an entirely different naming scheme for columns, i.e. ALL_CAPS (a sketch follows).
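
A sketch of the clash described in point 3 (the interface below is paraphrased from the description above, not copied from the package):

// Simplified view of the Table requirement described above.
type Table interface {
    GetAlias() string
    GetName() string
}

// With MixedCaps columns, a table holding a get_alias column would want a
// struct field named GetAlias. Go forbids a field and a method sharing a
// name, so such a struct could never also provide the GetAlias() method
// that Table demands. With ALL_CAPS the column becomes GET_ALIAS and the
// clash disappears.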

On SQL Type Safety

sq makes no effort to check the semantics of your SQL queries at runtime. Any type checking is enforced entirely by which methods you can call and which argument types you can pass to those methods. For example, you can call Asc()/Desc() and NullsFirst()/NullsLast() on any selected field and it will pass the type checker, because Asc()/Desc()/NullsFirst()/NullsLast() still return a Field interface:

u := tables.USERS().As("u")
sq.Select(u.USER_ID, u.USERNAME.Asc().NullsLast()).From(u)

which would translate to

SELECT u.user_id, u.username ASC NULLS LAST FROM users AS u
-- obviously wrong, you can't use ASC NULLS LAST inside the SELECT clause

The above example passes the Go type checker, so sq will happily build the query string, even though that SQL query is semantically wrong. In practice, as long as you aren't actively trying to do the wrong thing (like in the above example), the limited type safety will prevent you from making the most common kinds of errors.

It also means the query builder itself never fails: there's no boilerplate error checking required. Any semantic errors are deferred to the database, which will point them out to you.
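
As a rough illustration of where that boundary lies (the field names follow the earlier examples, and the number/string method split is assumed from the EqInt/EqString usage above):

u := tables.USERS().As("u")

// Caught at compile time: assuming NumberField has no EqString method,
// comparing user_id against a string simply does not compile.
// _ = u.USER_ID.EqString("Bob")

// Caught at compile time: a misspelled column is an unknown struct field,
// so u.USRE_ID does not compile either.

// Not caught: well-typed but semantically invalid SQL, like the
// ASC NULLS LAST inside a SELECT clause shown above.
_ = sq.Select(u.USER_ID, u.USERNAME.Asc().NullsLast()).From(u)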

Dialect agnostic query builder?

sq is not dialect agnostic. This means I can add your favorite dialect-specific SQL features without the headache of cross-dialect compatibility. It also makes contributions easier, as you only have to focus on your own SQL dialect and not worry about the others.

Comments
  • Add support for postgres UUID fields

    Adds support for the uuid postgres type, using the github.com/google/uuid package for storing UUIDs. I think in the future it might be worth adding configuration to sqgen-postgres to generate uuid fields as BinaryFields or StringFields (since they can be scanned into either data type just fine), but I'm keeping the initial PR small.

    Unfortunately, database/sql doesn't support a NullUUID type like it does for NullString, etc., so I've used uuid.Nil (which is just a [16]byte of all 0's) to represent a "null" uuid. I'm happy to change the implementation of null UUIDs if there's a better way to do this.

    I've added a few test cases, and populated the media table in the postgres devlab database so we can query for a deterministic UUID in the tests.

    If this PR gets accepted, I can also add the implementation for mysql and the accompanying documentation, just wanted to small start at first.

    I also moved the txdb registration into a func TestMain(m *testing.M) {} block - from what I can tell, that's a more common practice for test setup than init functions.

  • Allow the `Subquery` struct to satisfy the `Field` interface

    It would be useful if the Subquery type could satisfy the Field interface, so that it could be used to embed subqueries in a select clause, without needing to add it to the from clause.

    e.g. it would make it possible to write the following query (pulled from a project I'm working on)

    SELECT
    	matches.id,
    	matches.date_created,
    	matches.num_players,
    	(select jsonb_agg(
    		jsonb_build_object(
    			'ID', match_users.id,
    			'MatchID', match_users.match_id,
    			'UserID', match_users.user_id,
    			'Order', match_users.turn_order,
    			'IsCreator', match_users.is_creator,
    			'IsWinner', match_users.is_winner
    		))
    		from public.match_users where match_users.match_id = matches.id
    	) as "match_users"
    FROM public.matches
    WHERE matches.id = $1;
    

    Is there another way of doing this that I'm not aware of with this library? I don't mind reverting to plain SQL for cases like these, it would just be a nice addition to the query builder.

  • cannot convert symbolIDs["id"] (map index expression of type sq.CustomField) to sq.NumberField

    Have got a sq.CustomField that's the id of a table, and I'm not allowed to convert it or compare it to the id field of another table.

    Is there an easy way to convert a value, or do I need to (somehow) map all the fields in the CustomField to a NumberField mid-query?

  • Missing functionality on ArrayField

    This PR adds following functionality for ArrayField:

    1. A SetTo() method for setting the value of an ArrayField to another field in the query. I had to work around this in the upsert example by explicitly creating a FieldAssignment.
    2. Do not expand slices that implement the database/sql/driver.Valuer interface. I had to work around this by implementing the SQLAppender interface in my code for every enum Array.
  • How to specify select query as one of the select fields?

    With following two tables:

    users (id, name)
    likes (id, count, user_id)
    

    Following SQL query can be created:

    select *, (select count from likes where user_id = users.id) as like_count from users;
    

    The reason for putting the subquery in the field list is that it can be selectively added to the func(*sq.Row) function passed to Selectx. In my code I have a very complex subquery like this. I am using the following Go code:

    type User struct {
      ID uuid.UUID
      Name string
      Likes *int
    }
    
    func (u *User) ScanRow(t tables.USERS, fetchLikes bool) func(*sq.Row) {
      return func(row *sq.Row) {
        u.ID = row.UUID(t.ID)
        u.Name = row.String(t.NAME)
        if fetchLikes {
          row.ScanInto(&u.Likes, sq.Fieldf("(select count from likes where user_id = ?)", t.ID).As("like_count"))
        }
      }
    }
    

    Can you please help me in removing the hard-coded SQL?

  • Refactor sqgen-* to standalone library (decoupled from cobra), add lots of tests

    For one of my own projects, I wanted to call the sqgen-postgres logic directly, rather than invoke it through a shell command. To do this, I extracted almost all of sqgen-postgres and sqgen-mysql into the respective sqgen/postgres and sqgen/mysql packages, with a shared sqgen package for simple shared utilities.

    As a result, the process of code generation is no longer tied to cobra - the cmd/sqgen-postgres/ and cmd/sqgen-mysql/ packages are now just a thin wrapper around the code generation library in the sqgen/* package.

    I also did a decent amount of refactoring within the logic for each of the code generators so that it was a bit easier to test. The few major changes I made were:

    1. No more custom String type, as the QuoteSpace and Export functions were moved into a template.FuncMap and injected into the template - this means that the fields that previously used String can use the string type instead, which simplifies a decent amount of the generation code
    2. The logic for populating the fields of a Table/Function/Field is now a method (called Populate) on the given object. If it needs information about the rest of the functions/tables (i.e. is it a duplicate, is it an overload, etc.), these are passed into the Populate method as arguments - this made the testing for various edge cases a bit simpler.
    3. The generated code is written to an io.Writer, so it can target a writer of any sort. This is also used for dry-run generation, where os.Stdout is used as the writer instead of writing to a file.
    4. goimports is added as a dependency, and is used as a library (instead of through a shell command)
    5. For the sqgen-postgres functions command, the pg_catalog.pg_proc.prokind column is not available for Postgres versions lower than 11. I added a check to see which version of Postgres is being generated from (see the queryPgVersion and checkProkindSupport functions in sqgen/postgres/functions.go), and modify the query to use the legacy pg_proc columns to essentially do the same thing as p.prokind = 'f'.

    Of course this is a stupid number of changes to add all at once, so I've aimed to test as much of the code as possible, both through unit and integration tests.

    In the integration_test.go files, I check the output of the library against the output of sqgen-postgres and sqgen-mysql on the test databases (using the testdata) prior to any of the changes that I've made, and there are no differences in the output.

    Please let me know if there is anything you'd like me to change/test more, etc. before accepting the PR! I know that this is a large amount of code to change, but the core logic is pretty much the same as before, just structured a bit differently, and with a bunch of tests added.

  • Q/A Where field equal subquery?

    Hello, I can't find in the docs how I can make a query like this: select name from categories where uuid = (select parent_uuid from categories where uuid = 'd45f1df9-169b-11e8-ba44-d8cb8abf61a1'). I found a workaround using WhereIn, even though the question I asked is not quite the same. Sorry. I tried WhereIn and the final query matches what I wanted.

  • Multiple tables in a query

    Hi - I have a query that would be achieved in SQL like so

    SELECT foo.A, bar.B FROM A, B WHERE <some predicate>
    

    I don't want to do a join (if I do, it's a multi-table join), and the second table's field is only used for an inner query.

    I cannot see a way to do that with this tool, have I missed something?

    I've found that I can pass nil as a predicate - no idea what effect this will have on the query (yet, am testing)

  • fix: sqgen-mysql has no commands

    sqgen-mysql was doing nothing (except printing "Code generation for the sq package"), because the tables command was never actually added.

    Unrelatedly, I think this is a really interesting take on a golang query builder, and I really like the idea! The docs are exceptionally good too, awesome work there.

  • add cobra flag to allow tables to be excluded from generation

    I'm using the golang-migrate tool in order to create migrations for my tables, and that requires that there's a table called "schema_migrations" added to my database schema. I didn't want it to be included in the list of tables generated by this library, so I added the flags to support that.

    The filtering is done at the sql-level, so that all downstream commands don't have to know about any of the excluded tables.

    I was also playing around with having the exclude pattern be case-sensitive, but from what I can tell the table names in this library are case-insensitive anyways (for postgres, at least), so I decided not to add in case-sensitivity.

    I also split out part of the sql generation into separate commands so that they could be more easily tested, if you require that for the PR.

    Thanks for the great library! The DX is way better than a lot of the other alternatives.

  • data mapping for pointer fields will overwrite every item in the slice

    sq's multi-row struct mapping works by repeatedly binding data to a single variable, then appending that variable to a slice. This works if the variable is a value type, since it is copied by value on every append: subsequent modifications to that variable will not affect what has already been appended to the slice. However, if any fields are pointer types, subsequent modifications through those fields will overwrite the data for everything that has already been appended to the slice (since only the pointers were copied).
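
    A standalone sketch of the aliasing described above (plain Go, no sq involved; a mapper that reuses one pointer across rows hits exactly this):

    package main

    import "fmt"

    type User struct {
        Name  string
        Likes *int // pointer field, e.g. a NULL-able count
    }

    func main() {
        var user User
        var users []User
        likes := 0
        user.Likes = &likes // one allocation, reused for every "row"
        for i := 1; i <= 3; i++ {
            user.Name = fmt.Sprintf("user%d", i)
            *user.Likes = i * 10        // write through the shared pointer
            users = append(users, user) // struct copied, pointer aliased
        }
        for _, u := range users {
            fmt.Println(u.Name, *u.Likes) // every element prints 30
        }
    }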

    There are workarounds for this (manually copying data before append), but it's not very user friendly as it requires understanding the inner workings of the library in order to fully grok.

    I'm honestly not sure if there's a way around it, since the current way is the only way to have a fully 'generic' mapper function that works for any data type. By encapsulating stateful modifications in a closure (or a pointer receiver method), every mapper function can share the same type: func(*sq.Row).

    Even Go generics may not be sufficient here, as they do not permit type parameters on methods. The code below is ideal, but will not be achievable with Go's generics:

    func (q SelectQuery) Get[Item any](db *DB, mapper func(*Row) Item) (Item, error)
    func (q SelectQuery) GetSlice[Item any](db *DB, mapper func(*Row) Item) ([]Item, error)
    func (q SelectQuery) GetAccumulate[Item, Items any](db *DB, initial Items, mapper func(*Row) Item, accumulator func(Item, Items) Items) (Items, error)
    

    The document states that 'a suitably parameterized top-level function' will have to be written. Perhaps that may be the workaround.

    func Get[Item any](db *DB, q Query, mapper func(*Row) Item) (Item, error)
    func GetSlice[Item any](db *DB, q Query, mapper func(*Row) Item) ([]Item, error)
    

    However the invocation will be so clunky compared to regular Selectx that I am not so sure that it is a direct upgrade:

    // using Selectx
    var user User
    var users []User
    err := sq.
        From(u).
        Where(u.NAME.EqString("bob")).
        Selectx(user.RowMapper, func() { users = append(users, user) }).
        Fetch(db)
    if err != nil {
        // handle error
    }
    fmt.Println(users)

    // using GetSlice
    users, err := sq.GetSlice(db,
        sq.From(u).
            Where(u.NAME.EqString("bob")),
        User{}.RowMapper,
    )
    if err != nil {
        // handle error
    }
    fmt.Println(users)

    The only value Get brings over Selectx is that the mapper function can return a new value every time rather than performing a side effect on a pointer receiver. My main issue is that it solves the problem of overwriting pointer fields with an entirely different way of doing things (converting the query from a method to a top-level function).

  • Suggestion: Support Table/Field creation from Table structure

    When going to production you need to ensure the database has the same structure. It would make sense if we could run a function to create any missing tables/fields based on the generated structs.

    Just a suggestion.
