A golang library for building SQL

Gendry


gendry is a Go library that helps you operate databases. Based on go-sql-driver/mysql, it provides a series of simple but useful tools to prepare parameters for calling methods in the standard library database/sql.

The name gendry comes from a character in the hit drama Game of Thrones, in which Gendry is not only a bastard son of the late king Robert Baratheon but also a skilled blacksmith. Like his namesake, this library also forges something: SQL.

gendry consists of three isolated parts, and you can use each of them separately:


Manager

manager is used for initializing a database connection pool (i.e. sql.DB); you can set almost all the parameters that the mysql driver supports. For example, initializing a database connection pool:

var db *sql.DB
var err error
db, err = manager.
	New(dbName, user, password, host).
	Set(
		manager.SetCharset("utf8"),
		manager.SetAllowCleartextPasswords(true),
		manager.SetInterpolateParams(true),
		manager.SetTimeout(1*time.Second),
		manager.SetReadTimeout(1*time.Second),
	).Port(3302).Open(true)

In fact, all manager does is concatenate the dataSourceName.

The format of a dataSourceName is:

[username[:password]@][protocol[(address)]]/dbname[?param1=value1&...&paramN=valueN]
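
For instance, the configuration in the manager snippet above would produce a dataSourceName roughly like the following (host, credentials, and dbName are placeholders; the parameter names come from go-sql-driver/mysql):

user:password@tcp(host:3302)/dbName?charset=utf8&allowCleartextPasswords=true&interpolateParams=true&timeout=1s&readTimeout=1s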

manager is based on go-sql-driver/mysql, and if you're unsure what some of the manager.SetXXX functions do, see the mysql driver's GitHub home page. For more details, see manager's doc.

Builder

builder, as its name says, is for building SQL. Writing SQL by hand is intuitive but somewhat hard to maintain. And for `where in`, if you have a huge number of elements in the `in` set, it's very tedious to write.

builder isn't an ORM; in fact, one of the most important reasons we created Gendry is that we don't like ORMs. Gendry just provides some simple APIs to help you build SQL:

where := map[string]interface{}{
	"city": []string{"beijing", "shanghai"},
	// The in operator can be omitted by default,
	// which is equivalent to:
	// "city in": []string{"beijing", "shanghai"},
	"score": 5,
	"age >": 35,
	"address": builder.IsNotNull,
	"_or": []map[string]interface{}{
		{
			"x1":    11,
			"x2 >=": 45,
		},
		{
			"x3":    "234",
			"x4 <>": "tx2",
		},
	},
	"_orderby": "bonus desc",
	"_groupby": "department",
}
table := "some_table"
selectFields := []string{"name", "age", "sex"}
cond, values, err := builder.BuildSelect(table, where, selectFields)

//cond = SELECT name,age,sex FROM some_table WHERE (((x1=? AND x2>=?) OR (x3=? AND x4!=?)) AND score=? AND city IN (?,?) AND age>? AND address IS NOT NULL) GROUP BY department ORDER BY bonus DESC
//values = []interface{}{11, 45, "234", "tx2", 5, "beijing", "shanghai", 35}

rows, err := db.Query(cond, values...)

In the where map, the `in` operator is added automatically when the value is a slice (reflect.Slice).

where := map[string]interface{}{
	"city": []string{"beijing", "shanghai"},
}

is the same as

where := map[string]interface{}{
	"city in": []string{"beijing", "shanghai"},
}

The library also provides a useful API for executing aggregate queries like count, sum, max, min, and avg:

where := map[string]interface{}{
	"score >": 100,
	"city":    []interface{}{"Beijing", "Shijiazhuang"},
}
// AggregateSum, AggregateMax, AggregateMin, AggregateCount, AggregateAvg are supported
result, err := builder.AggregateQuery(ctx, db, "tableName", where, builder.AggregateSum("age"))
sumAge := result.Int64()
result, err = builder.AggregateQuery(ctx, db, "tableName", where, builder.AggregateCount("*"))
numberOfRecords := result.Int64()
result, err = builder.AggregateQuery(ctx, db, "tableName", where, builder.AggregateAvg("score"))
averageScore := result.Float64()

If you want to strip zero values from the where map, you can use builder.OmitEmpty:

where := map[string]interface{}{
	"score": 0,
	"age >": 35,
}
finalWhere := builder.OmitEmpty(where, []string{"score", "age"})
// finalWhere = map[string]interface{}{"age >": 35}

// supported kinds: Bool, Array, String, Float32, Float64, Int, Int8, Int16, Int32, Int64, Uint, Uint8, Uint16, Uint32, Uint64, Uintptr, Map, Slice, Interface, Struct

For complex queries, NamedQuery may be helpful:

cond, vals, err := builder.NamedQuery("select * from tb where name={{name}} and id in (select uid from anothertable where score in {{m_score}})", map[string]interface{}{
	"name": "caibirdme",
	"m_score": []float64{3.0, 5.8, 7.9},
})

assert.Equal("select * from tb where name=? and id in (select uid from anothertable where score in (?,?,?))", cond)
assert.Equal([]interface{}{"caibirdme", 3.0, 5.8, 7.9}, vals)

Slice values are expanded automatically according to their length, which makes these SQL statements very convenient for a DBA to review.
For critical systems, this is recommended.
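
Besides BuildSelect and NamedQuery, the builder package also provides BuildInsert, BuildUpdate, and BuildDelete, which take table names and maps in a similar way. The sketch below is only illustrative (the table and column names are made up; check builder's doc for the exact signatures and output):

// hypothetical table and columns, for illustration only
data := []map[string]interface{}{
	{"name": "caibirdme", "age": 23},
}
insertSQL, insertVals, err := builder.BuildInsert("some_table", data)
// insertSQL should look like: INSERT INTO some_table (age,name) VALUES (?,?)

updateSQL, updateVals, err := builder.BuildUpdate("some_table",
	map[string]interface{}{"name": "caibirdme"},
	map[string]interface{}{"age": 24},
)
// updateSQL should look like: UPDATE some_table SET age=? WHERE (name=?)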

For more detail, see builder's doc or just use godoc

Scanner

For each response from mysql, you want to map it onto your well-defined struct. scanner provides a very easy, reflection-based API to do this.

Using the standard library:
type Person struct {
	Name string
	Age int
}

rows, err := db.Query("SELECT age as m_age, name from g_xxx where xxx")
defer rows.Close()

var students []Person

for rows.Next() {
	var student Person
	rows.Scan(&student.Age, &student.Name)
	students = append(students, student)
}
Using scanner:
type Person struct {
	Name string `ddb:"name"`
	Age int `ddb:"m_age"`
}

rows, err := db.Query("SELECT age as m_age, name from g_xxx where xxx")
defer rows.Close()

var students []Person

scanner.Scan(rows, &students)

Types which implement the interface

type ByteUnmarshaler interface {
	UnmarshalByte(data []byte) error
}

will take over the corresponding unmarshal work.

type human struct {
	Age   int       `ddb:"ag"`
	Extra *extraInfo `ddb:"ext"`
}

type extraInfo struct {
	Hobbies     []string `json:"hobbies"`
	LuckyNumber int      `json:"ln"`
}

func (ext *extraInfo) UnmarshalByte(data []byte) error {
	return json.Unmarshal(data, ext)
}

// if the type of the ext column in the table is varchar (storing a legal JSON string) or json (MySQL 5.7+)
var student human
err := scanner.Scan(rows, &student)
// ...

The tag in the struct definition is used by scanner to resolve data from the response. The default tag name is ddb:"tagname", but you can specify your own, such as:

scanner.SetTagName("json")
type Person struct {
	Name string `json:"name"`
	Age int `json:"m_age"`
}

// ...
var student Person
scanner.Scan(rows, &student)

scanner.SetTagName is a global setting and it can be invoked only once.

ScanMap

rows, _ := db.Query("select name, age as m_age from person")
result, err := scanner.ScanMap(rows)
for _, record := range result {
	fmt.Println(record["name"], record["m_age"])
}

ScanMap scans data from rows and returns a []map[string]interface{}.
int, float, and string values may be stored as []uint8 by the mysql driver; ScanMap just copies those values into the map. If you're sure there's no binary data type in your mysql table (in most cases, this is true), you can use ScanMapDecode instead, which converts []uint8 into int, float64, or string.
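
A minimal sketch of ScanMapDecode under that assumption (same query as the ScanMap example above; error handling elided):

rows, _ := db.Query("select name, age as m_age from person")
result, err := scanner.ScanMapDecode(rows)
for _, record := range result {
	// m_age comes back as an int instead of []uint8
	fmt.Println(record["name"], record["m_age"])
}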

For more detail, see scanner's doc

PS:

  • Don't forget to close rows if you don't use one of the ScanXXXClose variants (see the sketch below)
  • The second parameter of Scan must be a pointer (e.g. &students)
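
For example, a minimal sketch using scanner.ScanClose, which scans the rows into the target and then closes them for you (it reuses the Person struct and query from the scanner example above):

rows, err := db.Query("SELECT age as m_age, name from g_xxx where xxx")
if err != nil {
	// handle err
}
var students []Person
// no defer rows.Close() needed: ScanClose closes rows itself
err = scanner.ScanClose(rows, &students)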

Tools

Besides the APIs above, Gendry provides a [CLI tool](https://github.com/caibirdme/gforge) to help generate code.
Owner

DiDi (滴滴出行)
Comments
  • bug about regex

    bug about regex

    https://github.com/didi/gendry/blob/master/builder/builder.go#L394 — this regex is greedy; it should be non-greedy. For example, take "select {{name}},{{comment}},value from table". In MySQL this is fine, but gendry will parse the placeholder as "name}},{{comment". I have to insert a space between }} and {{.

  • Can LIKE conditions support empty-value checks?

    Can LIKE conditions support empty-value checks?

    For example:

    where := builder.OmitEmpty(map[string]interface{}{
    	"I_STATUS":         params.Status,
    	"CH_BIZ_ID":        params.BizId,
    	"I_TYPE":           params.Type,
    	"I_CONTENT_TYPE":   params.ContentType,
    	"B_IS_DELETE":      constant.IsNotDelete,
    	"D_CREATE_DATE >":  params.CreateBeginAt,
    	"D_CREATE_DATE <=": params.CreateEndAt,
    }, []string{"I_TYPE", "I_CONTENT_TYPE", "D_CREATE_DATE >", "D_CREATE_DATE <="})

    if params.Title != "" {
    	where["CH_TITLE like"] = ...
    }

    builder.OmitEmpty cannot check empty values for LIKE conditions, so right now this has to be done manually.

  • Insert fails when a column name is a database reserved keyword

    Insert fails when a column name is a database reserved keyword

    The table schema is as follows:

    CREATE TABLE gateway (
    	id int(10) unsigned NOT NULL AUTO_INCREMENT,
    	name varchar(64) NOT NULL DEFAULT '' COMMENT 'gateway name',
    	desc varchar(256) NOT NULL DEFAULT '' COMMENT 'description',
    	deleted tinyint(1) NOT NULL DEFAULT '0' COMMENT 'deleted or not, 0: no, 1: yes',
    	create_time int(10) unsigned NOT NULL DEFAULT '0' COMMENT 'record creation time',
    	update_time timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'record update time',
    	PRIMARY KEY (id)
    ) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=utf8 COMMENT='gateway config table';

    The SQL assembled by gendry looks like this:

    [exec sql] sql: INSERT INTO gateway (create_time,deleted,desc,name) VALUES (?,?,?,?), args: [31 0 测试嘿嘿哈哈哈1053 gateway1053]

    Note: desc here is a database reserved keyword, which causes the error. It is recommended to wrap column names in backticks (``).

  • Embedded structs cannot be scanned

    Embedded structs cannot be scanned

    Given the following structs:

    type ModelMeta struct {
    	ID    int       `json:"id" ddb:"id"`
    	Ctime time.Time `json:"ctime" ddb:"ctime"`
    	Mtime time.Time `json:"mtime" ddb:"mtime"`
    }
    
    
    type App struct {
    	ModelMeta
    	Project string    `json:"project" ddb:"project"`
    	Name    string    `json:"name" ddb:"name"`
    }
    

    No data comes back when querying; if the embedding is removed, it works fine.

  • Scanner's bind function does not support the sql.Scanner interface when the value is a time.Time

    Scanner's bind function does not support the sql.Scanner interface when the value is a time.Time

    Version used: v1.6.0

    For various reasons, I need to use my own time-related struct. This struct already implements the sql.Scanner and sql.Valuer interfaces, and it works fine with sql.Row.Scan. But when using scanner.Scan, this error occurs:

    [scanner]: Role.CreatedTime is db.FormattedTime which is not AssignableBy time.Time
    

    After reading the code, I confirmed that the problem is caused by this code in the convert function in scanner.go:

    //time.Time to string
    switch assertT := mapValue.(type) {
    case time.Time:
    	return handleConvertTime(assertT, mvt, vit, &valuei, wrapErr)
    }
    
    if scanner, ok := valuei.Addr().Interface().(sql.Scanner); ok {
    	return scanner.Scan(mapValue)
    }
    
    
    func handleConvertTime(assertT time.Time, mvt, vit reflect.Type, valuei *reflect.Value, wrapErr convertErrWrapper) error {
    	if vit.Kind() == reflect.String {
    		sTime := assertT.Format(cTimeFormat)
    		valuei.SetString(sTime)
    		return nil
    	}
    	return wrapErr(mvt, vit)
    }
    
    

    In handleConvertTime, a time.Time can only be converted to a string.

    I hope you can look into this issue.

  • Proposal: Support `BuildInsert` receiving a struct rather than map[string]interface{}

    Proposal: Support `BuildInsert` receiving a struct rather than map[string]interface{}

    Thanks for this fantastic repo.

    Now, we use scanner.ScanClose(rows, interface{}) and reflection (tagName: ddb) to fill the interface, and we use builder.BuildInsert(map[string]interface{}) to build the insert SQL.

    In my humble opinion, this is not consistent.

    Could we support a function with the signature builder.BuildInsert(interface{}), and convert the interface{} to map[string]interface{} via reflection?

    For example:

    
    func ToMap(in interface{}, tagName string) (map[string]interface{}, error){
    	out := make(map[string]interface{})
    
    	v := reflect.ValueOf(in)
    
    	t := v.Type()
    
    	for i := 0; i < v.NumField(); i++ {
    		fi := t.Field(i)
    		if tagValue := fi.Tag.Get(tagName); tagValue != "" {
    			out[tagValue] = v.Field(i).Interface()
    		}
    	}
    	return out, nil
    }
    

    In this case, we could use builder.BuildInsert(TheStructWithDDBtag). Maybe it would be more consistent and convenient.

    How about this?

    If you don't mind, I'd be glad to try to implement this feature.

  • Add UnmarshalBinary support to Scanner

    Add UnmarshalBinary support to Scanner

    If the scan target is a struct and one of its fields is stored in the database as a JSON string, support parsing that field during scanning by implementing the UnmarshalBinary interface. Example:

    type TestInfo struct {
    	StrData    string      `ddb:"str_data"`
    	StructData *StructData `ddb:"struct_data"` // this field is a char column in the db storing a JSON string ({"foo":"foo","bar":"bar"})
    }

    type StructData struct {
    	Foo string `json:"foo"`
    	Bar string `json:"bar"`
    }

    func (this *StructData) UnmarshalDB(data []byte) error {
    	return json.Unmarshal(data, this)
    }

  • Can the where-map filtering be exposed as a standalone API?

    Can the where-map filtering be exposed as a standalone API?

    Currently we generally use a where map (a map[string]interface{}) together with builder, e.g. builder.BuildSelect(table, where, selectFields) takes a where map. I now have a case where I want to use NamedQuery for a complex query, but then the where part can no longer be expressed as a map.

    Could a method like whereConnector, or a similar public helper, be exposed so that users can turn a where map into concrete SQL? That way, NamedQuery could also take a where map for querying.

    Please consider whether this is feasible. Thanks!

  • Is it possible to add `_or` operator?

    Is it possible to add `_or` operator?

    gendry is a good tool for building SQL, except for OR statements. Can we add a new operator to support them? Like this:

    cond, vals, err := builder.BuildSelect(
    	"test",
    	map[string]interface{}{
    		"id >": 1,
    		"_or": map[string]interface{}{
    			"foo":   2,
    			"bar >": 3,
    		},
    	},
    	[]string{
    		"foo",
    	},
    )
    

    It would be equivalent to:

    SELECT `foo` FROM `test` WHERE `id` > 1 AND (`foo` = 2 OR `bar` > 3);
    
  • New version (1.3.0): sorting with _orderby raises an error

    New version (1.3.0): sorting with _orderby raises an error

    • Version 1.1.1 works fine; version 1.3.0 raises an error; removing the _orderby ordering makes it work again.
    • Error message:

    $ go run main.go panic: Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?' at line 1

    Test code: https://github.com/dewuzeng/gendry-demo

  • bug: SQL injection attack in "_orderby"

    bug: SQL injection attack in "_orderby".

    func main() {
    	where := map[string]interface{}{
    		"_orderby": "I_ID DESC",
    	}
    	qry, arg, _ := builder.BuildSelect("b_dealers", where, []string{"I_ID"})
    	fmt.Println(qry, arg)
           // SELECT I_ID FROM b_dealers ORDER BY I_ID DESC []
    }
    

    Compared to raw SQL, there is a SQL injection risk in "_orderby".


  • MySQL prepare handle error when passing LIMIT parameters

    MySQL prepare handle error when passing LIMIT parameters

    errmsg[Error 5007: prepare handle error] dnslookup[0] reuse[0] sql[/* {"log_id":"3230830605","xdb_comment":"1"} */ SELECT * FROM file_meta WHERE (isdelete=? AND server_filename=? AND user_id=?) LIMIT ?,?] sql_args[[0,"relay_expect",122050,0,100]]

  • scanner cannot scan the parent struct's tags when using Go struct embedding

    scanner cannot scan the parent struct's tags when using Go struct embedding

    Some values that come from database computed/aggregate functions cannot be scanned directly with struct embedding; the struct has to be redefined:

    type Person struct {
    	Name string `ddb:"name"`
    	Age  int    `ddb:"m_age"`
    }

    type PersonMulti struct {
    	Person
    	sum int `ddb:"sum"`
    }

  • improve `scanner.Map`'s performance by caching resolved tags

    improve `scanner.Map`'s performance by caching resolved tags

    scanner.Map resolves key names via reflection, which is relatively expensive; this change caches each struct's resolved key names to improve performance.

    Concretely, as shown below, caching improves the throughput of scanner.Map by roughly 1-2x.

    $ go test -run=BenchmarkMap -bench=BenchmarkMap -cpu=1,2,4,8 -benchtime=20000000x -benchmem
    goos: darwin
    goarch: amd64
    pkg: github.com/didi/gendry/scanner
    cpu: Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
    BenchmarkMapWithCache           20000000               265.4 ns/op           360 B/op          4 allocs/op
    BenchmarkMapWithCache-2         20000000               152.2 ns/op           360 B/op          4 allocs/op
    BenchmarkMapWithCache-4         20000000                88.22 ns/op          360 B/op          4 allocs/op
    BenchmarkMapWithCache-8         20000000                70.87 ns/op          360 B/op          4 allocs/op
    BenchmarkMapDisableCache        20000000               695.3 ns/op           400 B/op          9 allocs/op
    BenchmarkMapDisableCache-2      20000000               348.7 ns/op           400 B/op          9 allocs/op
    BenchmarkMapDisableCache-4      20000000               195.2 ns/op           400 B/op          9 allocs/op
    BenchmarkMapDisableCache-8      20000000               169.8 ns/op           400 B/op          9 allocs/op
    PASS
    ok      github.com/didi/gendry/scanner  39.892s
    