Upgrading from v3 to v4 of @tableland/sdk

The new Tableland SDK is a complete refactor that changes how developers use it—for the better!

By @Carson Farmer

TL;DR So you’re ready to upgrade your app or dApp to the latest and greatest version of the Tableland SDK? Great, welcome to the future! In this post, I’ll cover some of the gotchas and things to consider as you make the switch.

First things first, you’ll want to read the announcement post which outlines the new features of our fantastic new API. Once you’ve done that, pop back here and let’s get you going!

Once you’re back and familiar with the new features of the updated SDK, you’ll probably end up removing more code than you write (which is always nice). So let’s start with a quick library upgrade:

npm i @tableland/sdk

This will fetch the latest release of the 4.x version of the SDK. We’ll be maintaining the 3.x versions for a little while to help folks make the transition… but after a month or two, that branch will be deprecated completely, so you’re doing the right thing by upgrading now.

If you’re in a TypeScript project, you’ll likely see all sorts of type issues. This is extremely useful, as it points you right to the things you need to change. If you aren’t, or want to find things manually, just remember that all the create, read, and write method calls are now part of the core Statement API, which you access via a Database instance. And the individual calls have been merged into a single API (you can just call stmt.all(), stmt.run(), etc.).
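To make that concrete, here’s a minimal sketch of the merged statement shape. The `db` parameter stands in for a Database instance from @tableland/sdk; it (and `inspectTable` itself) are typed loosely here so the sketch stands alone, and are illustrative rather than part of the SDK.

```typescript
// Sketch of the merged v4 Statement API: one prepared statement, executed in
// whichever mode fits what you need back. `db` is a Database instance from
// @tableland/sdk, typed loosely so this sketch is self-contained.
async function inspectTable(db: any, tableName: string) {
  const stmt = db.prepare(`SELECT * FROM ${tableName};`);
  const { results } = await stmt.all(); // every matching row, as objects
  const { meta } = await stmt.run();    // execution metadata only
  return { results, meta };
}
```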


If you only ever need to read from the Tableland network, then you’re in luck… things just got a whole lot easier. We anticipate there will be quite a few common applications where you just need to read some data from a table or tables and display or use that data in your app in some way. For this, you can swap out something like this:

import { connect } from "@tableland/sdk";

const tableName = "healthbot_80001_1"; // Our pre-defined health check table

const conn = await connect({ network: "testnet", chain: "polygon-mumbai" });
// The above will generally cause a MetaMask popup to sign a SIWE token, can also do it manually
await conn.siwe();

const { data } = await conn.read(`SELECT * FROM ${tableName};`);

Which requires signing, for this (no signing required):

import { Database } from "@tableland/sdk";

const tableName = "healthbot_80001_1"; // Our pre-defined health check table

const db = Database.readOnly("maticmum"); // Polygon Mumbai testnet

const { results } = await db.prepare(`SELECT * FROM ${tableName};`).all();

You’ll also notice we now use the same naming conventions for networks as ethers.js. There are some other features of ethers.js that Tableland has adopted to make developers’ lives easier. We’ve outlined a few in this doc, and you’ll find more as you dive deeper into the docs.
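For quick reference, here’s a tiny lookup of the ethers-style network names touched on in this post. These names and chain ids follow the standard ethers conventions; the lookup itself is just illustrative, not an SDK API.

```typescript
// ethers-style network names and their chain ids, for the chains this post uses
const chainIds: Record<string, number> = {
  maticmum: 80001, // Polygon Mumbai testnet ("polygon-mumbai" in the v3 SDK)
  matic: 137,      // Polygon mainnet
  homestead: 1,    // Ethereum mainnet
};

console.log(chainIds.maticmum); // 80001
```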

There is a safer way to inject table names into SQL queries like the above using @databases/sql; see our tests for an example.


While the above is slightly simpler for the average use case, the increased simplicity becomes a lot more obvious when you are doing mutating queries, creating new tables, and just generally interacting with the Tableland smart contracts. For instance, the old way to create, mutate, and then query a table looked something like this:

import { connect } from "@tableland/sdk";

// Connect to the Tableland testnet
// Default to grabbing a MetaMask connection in a browser
const conn = await connect();
// For client-side apps, call `siwe` to prompt a browser wallet sign-in flow
await conn.siwe();

// Create a new table with a supplied SQL schema and optional `prefix`
const { name } = await conn.create(
  `id integer primary key, name text`, // Table schema definition
  {
    prefix: `my_sdk_table`, // Optional `prefix` used to define a human-readable string
  }
);

// The table's `name` is in the format `{prefix}_{chainId}_{tableId}`
console.log(name); // e.g., my_sdk_table_80001_311
// Without the supplied `prefix`, `name` would be `_80001_311`

// Insert a row into the table
const writeRes = await conn.write(
  `INSERT INTO ${name} (id, name) VALUES (0, 'Bobby Tables');`
);

// Perform a read query, requesting all rows from the table
const readRes = await conn.read(`SELECT * FROM ${name};`);
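Since the returned `name` embeds the chain and table ids, a small helper can pull them back out. This is a quick sketch based on the `{prefix}_{chainId}_{tableId}` format described above; the helper is illustrative, not part of the SDK.

```typescript
// Parse a Tableland table name of the form `{prefix}_{chainId}_{tableId}`.
// The prefix may contain underscores (or be empty), so split from the right.
function parseTableName(name: string): { prefix: string; chainId: number; tableId: number } {
  const parts = name.split("_");
  const tableId = Number(parts.pop());
  const chainId = Number(parts.pop());
  return { prefix: parts.join("_"), chainId, tableId };
}

console.log(parseTableName("my_sdk_table_80001_311"));
// { prefix: "my_sdk_table", chainId: 80001, tableId: 311 }
console.log(parseTableName("_80001_311")); // no prefix supplied
// { prefix: "", chainId: 80001, tableId: 311 }
```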

With the new SDK, the above is simplified significantly and only requires the following set of commands. It is also safer and simpler to use with parameter binding, and you can more easily control which statements you wait for (and more). Overall, it feels a lot more like you are “just writing SQL” statements:

import { Database } from "@tableland/sdk";

// Default to grabbing a MetaMask connection in a browser
const db = new Database();

const { meta: create } = await db
  .prepare("CREATE TABLE my_sdk_table (id integer primary key, name text);")
  .run();

// The table's `name` is in the format `{prefix}_{chainId}_{tableId}`
const { name } = create.txn;
console.log(name); // e.g., my_sdk_table_80001_311

// Insert a row into the table
const { meta: insert } = await db
  .prepare(`INSERT INTO ${name} (id, name) VALUES (?, ?);`)
  .bind(0, "Bobby Tables")
  .run();

// Wait for transaction finality
await insert.txn.wait();

// Perform a read query, requesting all rows from the table
const { results } = await db.prepare(`SELECT * FROM ${name};`).all();

In the old way, statements were relayed by the Validator(s), so only a single signature was needed. In the new version, the Validators no longer relay transactions, so a flow like the above may require two signed transactions. For multiple writes, you can also use the db.batch([…]) method to batch these calls into a single signed transaction (which also uses less gas).
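To give a feel for the batch pattern, here’s a sketch. The `db` parameter stands in for a Database instance from @tableland/sdk (typed loosely so the sketch is self-contained), and `seedRows` is an illustrative helper, not an SDK method.

```typescript
// Batch several mutations into one signed transaction (sketch).
// `db` is a Database instance from @tableland/sdk, typed loosely here.
async function seedRows(db: any, tableName: string): Promise<void> {
  const [res] = await db.batch([
    db.prepare(`INSERT INTO ${tableName} (id, name) VALUES (?, ?);`).bind(1, "Alice"),
    db.prepare(`INSERT INTO ${tableName} (id, name) VALUES (?, ?);`).bind(2, "Bob"),
  ]);
  // One wallet signature covers both inserts (and uses less gas)
  await res.meta.txn?.wait();
}
```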

More control

For folks that were using the old SDK for more fine-grained access to Tableland, such as getting, setting, and locking controller contracts, or listing a user’s tables, you might find that these operations don’t fit within the Database abstraction provided above. This is where the two additional core APIs come into play.

For direct access to API calls on the Validator(s), you can leverage the Validator class:

import { Validator } from "@tableland/sdk";

// Pull info from an existing Database instance
const obj = new Validator(db.config); // Must have baseUrl defined

const isHealthy = await obj.health();
console.log(isHealthy); // true

const { name, schema } = await obj.getTableById({ chainId: 80001, tableId: "1" });
console.log(name); // healthbot_80001_1
console.log(schema);
// {
//   columns: [
//     {
//       name: "counter",
//       type: "integer",
//     },
//   ],
// }
Similarly, for more direct access to the Tableland Tables smart contract methods, you can leverage the Registry class:

import { Registry } from "@tableland/sdk";
import { getContractReceipt } from "@tableland/sdk/helpers";
// Can't use conditional exports? You can also soon use:
// import { helpers } from "@tableland/sdk"
// const { getContractReceipt } = helpers;

// Pull info from an existing Database instance
const reg = new Registry(db.config); // Must have signer defined

const tx = await reg.createTable({
  chainId: 31337,
  statement: "create table test_ownership_31337 (id int, name text)",
});
// Helper function to extract table name event information
const receipt = await getContractReceipt(tx);

// List my tables
const results = await reg.listTables(/* defaults to connected wallet address */);

// Transfer the above table to my friend!
const transferTx = await reg.safeTransferFrom({
  to: friendAddress, // a wallet address to transfer ownership to
  tableName: receipt, // Also accepts name as string
});
// Tableland adopts this "wait" style pattern from ethers!
await transferTx.wait();


With the new SDK, we have made third-party library integrations a top priority. For example, if you are writing Tableland interactions inside a React app that uses something like wagmi, the above examples might start off something like the following (inside your components/hooks):

import { useSigner } from "wagmi";
import { Database } from "@tableland/sdk";

function App() {
  const { data: signer } = useSigner();
  const db = Database.fromSigner(signer);
  // ...
}

Additionally, thanks to our support for Cloudflare’s D1Database interface, support for an ORM is possible via d1-orm. Here’s a quick example of creating, updating, and querying a table via a Model object:

import { D1Orm, DataTypes, Model, type Infer } from "d1-orm";
import { Database } from "@tableland/sdk";

const db = new Database({ autoWait: true });
const orm = new D1Orm(db);

const users = new Model(
  {
    D1Orm: orm,
    tableName: "users",
    primaryKeys: "id",
    uniqueKeys: [["email"]],
  },
  {
    id: {
      type: DataTypes.INTEGER,
      notNull: true,
    },
    name: {
      type: DataTypes.STRING,
      notNull: true,
    },
    email: {
      type: DataTypes.STRING,
    },
  }
);
type User = Infer<typeof users>;

const { meta: create } = await users.CreateTable({
  strategy: "default",
});

// Slight temporary hack
(users as any).tableName = create.txn.name;

await users.InsertOne({
  name: "Bobby Tables",
  email: "[email protected]",
});

const { results } = await users.All({
  where: { name: "Bobby Tables" },
  limit: 1,
  offset: 0,
  orderBy: ["id"],
});

Additional integrations provide some client-side safety for injecting table names, query parameters, and more via prepared statement syntax. While you don’t need @databases/sql to leverage prepared statements with the Tableland SDK, it does provide some nice methods for working with raw SQL strings, so we leverage it here:

import sql, { FormatConfig } from "@databases/sql";
import { escapeSQLiteIdentifier } from "@databases/escape-identifier";
import { Database } from "@tableland/sdk";

// See https://www.atdatabases.org/docs/sqlite
const sqliteFormat: FormatConfig = {
	escapeIdentifier: (str) => escapeSQLiteIdentifier(str),
	formatValue: (value) => ({ placeholder: "?", value }),
};

const db = new Database();

// First, we'll test out using sql identifiers
const primaryKey = sql.ident("id");
const query = sql`CREATE TABLE test_sql (${primaryKey} integer primary key, counter integer, info text);`;
const { text, values } = query.format(sqliteFormat);
const { meta } = await db.prepare(text).bind(...values).run();
const { name } = await meta.txn.wait();
console.log(`Created table ${name}`);

What about all those fancy ethers.js tools out there? We can leverage those quite nicely in Tableland, as we have pretty direct control over the Signer interface that drives our database mutations. Here’s how you might instantiate a Database within a Node app:

import { NonceManager } from "@ethersproject/experimental";
import { Database } from "@tableland/sdk";
import { Wallet, getDefaultProvider } from "ethers";

// Or maybe you want to use the dotenv package
const privateKey = process.env.PRIVATE_KEY;

const wallet = new Wallet(privateKey);
const provider = getDefaultProvider(""); // Your network name or RPC URL here
const baseSigner = wallet.connect(provider);
// Also demonstrates the nonce manager usage
const signer = new NonceManager(baseSigner);
const db = new Database({ signer });

// No need to await individual transactions (due to the nonce manager)!


As you can see, updates are pretty minimal in most cases, and the range of API calls needed has been reduced. Additionally, the modular design of the SDK means developers only need to import and use the parts of the SDK that make sense for their application. Don’t need to mutate tables? Great, don’t bother with all that signing stuff; just read and be done with it! Want more direct access to transaction receipts? Sure, use the Validator API. What about writing directly to the smart contract? We got ya, just import the Registry API. Need to know which chains are supported by the SDK? We have a helper function for that, and just about anything else you’ll need when working with Tableland.

And remember, you have a little time to make the transition to the new APIs and tools. You’ll be happy once you do! And in the meantime, if you have any questions, bug reports, concerns, or just want to nerd out, don’t hesitate to hop into our Discord and say hey!