entimport

entimport is a tool for creating Ent schemas from existing SQL databases. Currently, MySQL and PostgreSQL are supported. The tool can import any number of tables into an ent schema, including the relations between them.

Installation

Setup A Go Environment

If your project directory is outside GOPATH, or you are not familiar with GOPATH, set up a Go module project as follows:

go mod init <project>

Install ent

go install entgo.io/ent/cmd/ent

After installing the ent codegen tool, you should have it in your PATH. If you don't find it in your PATH, you can also run: go run entgo.io/ent/cmd/ent

Create Schema Directory

Go to the root directory of your project, and run:

ent init

The command above will create the ent/schema/ directory and the ent/generate.go file.
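
For reference, generate.go typically contains only a go:generate directive that runs the ent code generator over the schema directory (a minimal sketch; the exact directive may vary by ent version):

package ent

//go:generate go run -mod=mod entgo.io/ent/cmd/ent generate ./schema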

Importing a Schema

Installing and running entimport

go run ariga.io/entimport/cmd/entimport
  • For example, importing a MySQL schema with a users table:
go run ariga.io/entimport/cmd/entimport -dialect mysql -dsn "root:pass@tcp(localhost:3308)/test" -tables "users"

The command above will write a valid ent schema into the directory specified (or the default ./ent/schema):

.
├── generate.go
└── schema
    └── user.go

1 directory, 2 files
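
To give a sense of the output, here is a minimal sketch of what schema/user.go might look like; the name and age columns are assumptions, and the actual fields depend on the columns of your users table:

// Code generated by entimport, DO NOT EDIT.

package schema

import (
	"entgo.io/ent"
	"entgo.io/ent/schema/field"
)

type User struct {
	ent.Schema
}

func (User) Fields() []ent.Field {
	return []ent.Field{field.Int("age"), field.String("name")}
}
func (User) Edges() []ent.Edge {
	return nil
}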

Code Generation:

In order to generate ent files from the produced schemas, run:

go run -mod=mod entgo.io/ent/cmd/ent generate ./schema

# Or, if you set up the project with `ent init`:

go generate ./ent

If you are not yet familiar with ent, you can also follow the quick start guide.

Usage

entimport -h
Usage of ./entimport:
  -dialect string
        database dialect (default "mysql")
  -dsn string
        data source name (connection information)
  -schema-path string
        output path for ent schema (default "./ent/schema")
  -tables value
        comma-separated list of tables to inspect (all if empty)

Examples:

  1. Import an ent schema from a Postgres database:

Note: add search_path=foo if you use a non-public schema.

go run ariga.io/entimport/cmd/entimport -dialect postgres -dsn "host=localhost port=5434 user=postgres dbname=test password=pass sslmode=disable" 
  2. Import an ent schema from a MySQL database:
go run ariga.io/entimport/cmd/entimport -dialect mysql -dsn "root:pass@tcp(localhost:3308)/test"
  3. Import only specific tables:

Note: When importing specific tables:
if the table is a join table, you must also provide the referenced tables;
if the table is only one part of a relation, the other part won't be imported unless it is also specified;
if the -tables flag is omitted, all tables in the current database schema will be imported.

go run ariga.io/entimport/cmd/entimport -dialect postgres -dsn "..." -tables "users,user_friends" 
  4. Import to another directory:
go run ariga.io/entimport/cmd/entimport -dialect postgres -dsn "..." -schema-path "/some/path/here"

Future Work

  • Index support (currently only unique indexes are supported).
  • Support for all data types (for example, uuid in Postgres).
  • Support for default values in columns.
  • Support for editing schemas both manually and automatically (a real upsert rather than only overwriting).
  • Postgres special types: postgres.NetworkType, postgres.BitType, *schema.SpatialType, postgres.CurrencyType, postgres.XMLType, postgres.ArrayType, postgres.UserDefinedType.

Known Caveats:

  • Schema files are overwritten by new calls to entimport.
  • There is no difference in DB schema between O2O Bidirectional and O2O Same Type - both will result in the same ent schema.
  • There is no difference in DB schema between M2M Bidirectional and M2M Same Type - both will result in the same ent schema.
  • In recursive relations, the edge names will be prefixed with child_ & parent_. For example, a users table with an M2M relation to itself will result in:

func (User) Edges() []ent.Edge {
	return []ent.Edge{edge.To("child_users", User.Type), edge.From("parent_users", User.Type)}
}

Feedback & Support

For discussion and support, open an issue or join our channel in the gophers Slack.

Comments
  • add Annotation

    My MySQL database table names do not use the plural form, so the generated files always require me to manually add annotations, which is too inconvenient. I have solved this problem, but I am not sure whether my implementation is the best approach, like in this image

    await...

  • Broken method name for MySQL DECIMAL fields

    When I generate the fields for a MySQL table with DECIMAL columns (DECIMAL(19, 6) in this case), I'm getting fields like this:

    field.Float64("cost")
    

    However, this leads to a compilation error, because the method name is actually Float, not Float64. Right now I have to change it by hand to:

    field.Float("cost")
    
  • import failed - pq argument of AND must be type boolean

    Attempting to import from a postgres 13 database with this command:

    go run ariga.io/entimport/cmd/entimport -dialect postgres -dsn "host=localhost port=5432 user=postgres dbname=testdb password=tesdbPass sslmode=disable" -tables "services"

    It crashes with the error: entimport: schema import failed - postgres: querying schema tables: pq: argument of AND must be type boolean, not type information_schema.sql_identifier

    If I do not include the -tables param, it crashes with: schema import failed - entimport: invalid primary key - single part key must be present

  • join tables must be inspected with ref tables - append  `tables` flag

    So I tried generating my ent schema using entimport and everything seemed to work just fine until I noticed that my Join table wasn't in the folder. I ran this command trying to generate the table:

    go run ariga.io/entimport/cmd/entimport -tables friendship  -dsn "<mydsn>"
    

    but I still got an error telling me that schema import failed:

    entimport: schema import failed - entimport: join tables must be inspected with ref tables - append 
    `tables` flag
    

    I've tried the append flag but I can't seem to find it in the docs.
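
    For reference, per the note in the Examples section above, the fix appears to be listing the referenced tables together with the join table in the -tables flag (assuming friendship references users; the table names below come from this report and are otherwise assumptions):

    go run ariga.io/entimport/cmd/entimport -tables "friendship,users" -dsn "<mydsn>"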

  • invalid primary key - single part key must be present

    What does the above error mean? The table I am trying to import has a primary key and is not partitioned.

    create table artikel
    (
        `Artikel-Id` int auto_increment primary key,
        requires_internal_protocol varchar(255) null
    )
        collate = latin1_german2_ci;
    

    Storage engine InnoDB with MySQL 5.7

  • ent can't generate from incorrect edge (mismatched field types)

    ent generates an error when trying to parse the schema produced from the DDL below. The provided error is: "mismatch field type between edge field "username" and id of type "User" (string != int32)"

    CREATE TABLE `users` (
      `id` int(11) NOT NULL AUTO_INCREMENT,
      `username` varchar(50) NOT NULL,
      `token` varchar(255) DEFAULT NULL,
      PRIMARY KEY (`id`),
      UNIQUE KEY `username` (`username`),
      UNIQUE KEY `token` (`token`)
    ) ENGINE=InnoDB AUTO_INCREMENT=23 DEFAULT CHARSET=utf8;
    
    CREATE TABLE `programme_uploads` (
      `id` int(11) NOT NULL AUTO_INCREMENT,
      `username` varchar(50) NOT NULL,
      `upload_time` datetime NOT NULL,
      PRIMARY KEY (`id`),
      KEY `username` (`username`),
      KEY `upload_time_index` (`upload_time`),
      CONSTRAINT `programme_uploads_ibfk_1` FOREIGN KEY (`username`) REFERENCES `users` (`username`) ON DELETE NO ACTION ON UPDATE NO ACTION
    ) ENGINE=InnoDB AUTO_INCREMENT=23 DEFAULT CHARSET=utf8;
    

    I think it might be missing the foreign key reference in the generation?

  • How to generate Ent schemas for whole Postgres DB

    In my DB I have a few schemas and when I run:

    go run ariga.io/entimport/cmd/entimport -dsn "postgres://user:password@localhost:5432/db_name?sslmode=disable"
    

    it generates Ent schemas only for the public schema. I know that I can specify search_path=<schema_name> in the DSN. But then again, I have tables with the same names inside different schemas. How is this supposed to work?
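
    A possible workaround, assuming each database schema is imported in a separate run with its own search_path and its own output directory (the billing schema name and directory paths below are illustrative):

    go run ariga.io/entimport/cmd/entimport -dialect postgres -dsn "host=localhost port=5432 user=user dbname=db_name password=password sslmode=disable search_path=public" -schema-path "./ent/schema_public"
    go run ariga.io/entimport/cmd/entimport -dialect postgres -dsn "host=localhost port=5432 user=user dbname=db_name password=password sslmode=disable search_path=billing" -schema-path "./ent/schema_billing"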

  • Bug: mysql/mariadb on update current_timestamp()

    Both mysql and mariadb can accept a precision for current_timestamp

    mysql: https://dev.mysql.com/doc/refman/8.0/en/timestamp-initialization.html mariadb: https://mariadb.com/kb/en/current_timestamp/

    So the line case "default_generated on update current_timestamp", "on update current_timestamp": in the extraAttr() function may need to be adjusted.

    Schema:

    create table test
    (
        id            bigint unsigned auto_increment
            primary key,
        expires_at    timestamp default current_timestamp() not null on update current_timestamp()
    )
        collate = utf8mb4_unicode_ci;
    
    

    I would send a PR but I cannot find the code in https://github.com/ariga/atlas

  • Blog post this feature

    Did this get a blog post on the Ent website?

    The reason I raise this is that when I discuss ent with long-time devs, they dislike that ent is not database-first in terms of the source of the schema.

  • all: add mux for db connections

    This PR uses a mux, inspired by Atlas, which is now in charge of the connection to the DB. The SchemaImporter now only implements dialect-specific converters. Updated the README and CLI to use the mux.

  • is it possible to add db pool option

    See the sql.DB Integration docs on the ent site: in normal code we can set DB options on the DB object, but in the generated code we cannot access the DB object. So it would be helpful to add some functional options to set SetMaxIdleConns, SetMaxOpenConns, and SetConnMaxLifetime.
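
    For context, a minimal sketch of the sql.DB integration pattern from the ent docs that this refers to; the MySQL driver, DSN, and pool values are assumptions, and <project>/ent stands for the project's generated package:

    package main

    import (
    	"database/sql"
    	"time"

    	"entgo.io/ent/dialect"
    	entsql "entgo.io/ent/dialect/sql"
    	_ "github.com/go-sql-driver/mysql"

    	"<project>/ent"
    )

    // Open wraps an existing *sql.DB so pool options can be tuned
    // before handing the connection to the generated ent client.
    func Open(dsn string) (*ent.Client, error) {
    	db, err := sql.Open("mysql", dsn)
    	if err != nil {
    		return nil, err
    	}
    	db.SetMaxIdleConns(10)
    	db.SetMaxOpenConns(100)
    	db.SetConnMaxLifetime(time.Hour)
    	drv := entsql.OpenDB(dialect.MySQL, db)
    	return ent.NewClient(ent.Driver(drv)), nil
    }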

  • fix: table name may not be the same as in the database

    When I use entimport to generate ent code, a table name that ends with s gets trimmed (singularized). To keep the generated code consistent with the database, it is recommended not to call this method.

    For example, my table name is profit_and_loss, and the generated file name is profit_and_los.go

    // old version
    func typeName(tableName string) string {
    	return inflect.Camelize(inflect.Singularize(tableName))
    }
    
    // new version
    func typeName(tableName string) string {
    	return inflect.Camelize(tableName)
    }
    
  • not enough arguments in call to field.UUID

    I'm having an issue generating the schema from my existing postgres database. I've been following this post - https://entgo.io/blog/2021/10/11/generating-ent-schemas-from-existing-sql-databases/

    When I exec go generate ./ent I get the following error:

    entc/load: load schema dir: /home/kay/powerlevel/backend/ent/schema/authentication.go:19:276: not enough arguments in call to field.UUID have (string) want (string, driver.Valuer) exit status 1 ent/generate.go:3: running "go": exit status 1

    database

    
    -- CreateTable
    CREATE TABLE "users" (
        "id" UUID NOT NULL DEFAULT gen_random_uuid(),
        "email" TEXT NOT NULL,
        "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
        "updated_at" TIMESTAMP(3),
        "deleted_at" TIMESTAMP(3),
    
        CONSTRAINT "users_pkey" PRIMARY KEY ("id")
    );
    
    -- CreateTable
    CREATE TABLE "authentication" (
        "id" UUID NOT NULL DEFAULT gen_random_uuid(),
        "type" TEXT NOT NULL,
        "service_id" TEXT NOT NULL,
        "session_id" TEXT NOT NULL,
        "password" TEXT NOT NULL,
        "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
        "updated_at" TIMESTAMP(3),
        "deleted_at" TIMESTAMP(3),
        "user_id" UUID NOT NULL,
    
        CONSTRAINT "authentication_pkey" PRIMARY KEY ("id")
    );
    
    -- AddForeignKey
    ALTER TABLE "authentication" ADD CONSTRAINT "authentication_user_id_fkey" FOREIGN KEY ("user_id") REFERENCES "users"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
    

    I exec the following to get the code generated - go run ariga.io/entimport/cmd/entimport -dsn "postgres://root:password@localhost:5432/powerlevel?sslmode=disable"

    ent/schema/authentication.go

    // Code generated by entimport, DO NOT EDIT.
    
    package schema
    
    import (
    	"entgo.io/ent"
    	"entgo.io/ent/dialect/entsql"
    	"entgo.io/ent/schema"
    	"entgo.io/ent/schema/edge"
    	"entgo.io/ent/schema/field"
    	"github.com/google/uuid"
    )
    
    type Authentication struct {
    	ent.Schema
    }
    
    func (Authentication) Fields() []ent.Field {
    	return []ent.Field{field.UUID("id", uuid.UUID{}), field.String("type"), field.String("service_id"), field.String("session_id"), field.String("password"), field.Time("created_at"), field.Time("updated_at").Optional(), field.Time("deleted_at").Optional(), field.UUID("user_id").Optional(uuid.UUID{})}
    }
    func (Authentication) Edges() []ent.Edge {
    	return []ent.Edge{edge.From("user", User.Type).Ref("authentications").Unique().Field("user_id")}
    }
    func (Authentication) Annotations() []schema.Annotation {
    	return []schema.Annotation{entsql.Annotation{Table: "authentication"}}
    }
    

    ent/schema/user.go

    // Code generated by entimport, DO NOT EDIT.
    
    package schema
    
    import (
    	"entgo.io/ent"
    	"entgo.io/ent/schema"
    	"entgo.io/ent/schema/edge"
    	"entgo.io/ent/schema/field"
    	"github.com/google/uuid"
    )
    
    type User struct {
    	ent.Schema
    }
    
    func (User) Fields() []ent.Field {
    	return []ent.Field{field.UUID("id", uuid.UUID{}), field.String("email"), field.Time("created_at"), field.Time("updated_at").Optional(), field.Time("deleted_at").Optional()}
    }
    func (User) Edges() []ent.Edge {
    	return []ent.Edge{edge.To("authentications", Authentication.Type)}
    }
    func (User) Annotations() []schema.Annotation {
    	return nil
    }
    

    This is my first time using entgo, am I doing something wrong?

    go version go1.18.4 linux/amd64
    ariga.io/entimport v0.0.0-20220722070026-e5b57d96ab7c // indirect
    entgo.io/ent v0.11.1 // indirect
    
    

    Just playing around with it, it seems like in ent/schema/authentication.go this field.UUID("user_id").Optional(uuid.UUID{}) should have been field.UUID("user_id", uuid.UUID{}).Optional()? After making that change and attempting to generate again, it does generate without errors. But it seems like I should not be editing these generated files. Also, I don't think this manual change has set up the edges properly, as the following statement brings back no authentication data despite it being in the db: users, err := services.EntClient.User.Query().WithAuthentications().All(services.EntContext)

  • entimport fails to generate uuid fields correctly

    I believe entimport does not currently match the field.UUID API which accepts a second argument.

    I just got a field generated as: field.UUID("fooer_id").Optional(uuid.UUID{})

    where I believe the appropriate syntax is field.UUID("fooer_id", uuid.UUID{}).Optional()

  • Windows import existing schema error

    go run ariga.io/entimport/cmd/entimport -dsn "mysql://root:pass@tcp(192.168.10.153:3306)/zzz" -tables "xxx"^C

    The same command generates the schema.go file properly on Linux.

  • add prefix/suffix if ent reserved words are detected

    Actual: A DB has a clients table that collides with ent.Client, and a subsequent change to the ent.Schema is necessary to fix the error.

    Suggestion: When importing an existing database with entimport, there should be a prefix (or suffix) for the generated types. In the configuration for entimport there should be an option to specify the prefix.

    Example: prefix: legacy, table: clients, result: LegacyClient
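
    As a sketch of today's manual workaround (not the requested feature): the imported schema type can be renamed by hand and mapped back to the existing table with an entsql.Annotation, the same pattern entimport already emits for non-default table names. The LegacyClient name follows the example above:

    package schema

    import (
    	"entgo.io/ent"
    	"entgo.io/ent/dialect/entsql"
    	"entgo.io/ent/schema"
    )

    // LegacyClient maps to the existing clients table and avoids
    // colliding with the generated ent.Client.
    type LegacyClient struct {
    	ent.Schema
    }

    // Annotations binds the renamed type back to the clients table.
    func (LegacyClient) Annotations() []schema.Annotation {
    	return []schema.Annotation{entsql.Annotation{Table: "clients"}}
    }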
