Error: No such container: ee-global-db

Trying to import the database using the command:

docker exec -i ee-global-db mysql -udbuser -pdbpass --database=dbname < dump.sql

But I get an error: “Error: No such container: ee-global-db”

Please help me.
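A note in case the container is simply named differently on your install: you can list the running containers and look for the database one, for example:

docker ps --format '{{.Names}}' | grep db

(the grep pattern is just a guess at what the database container is called) and then use whatever name that prints in place of ee-global-db in the docker exec command.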

Use this command, it’ll export to the site’s htdocs.

ee shell example.com --command='wp db export'

I used the command:
ee shell example.com --command='wp db import db.sql'

But the import takes about 3 hours (the database is over 8 GB). How can I integrate this solution https://stackoverflow.com/questions/2167522/innodb-takes-over-an-hour-to-import-600mb-file-myisam-in-a-few-minutes/2167641#2167641 ?

Looks simple enough: just edit the SQL file?
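If I understand the linked answer correctly, yes: wrap the dump so the checks are switched off for the import and turned back on afterwards. A rough sketch (db_fast.sql is just my name for the rewritten file):

(echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"; cat db.sql; echo "COMMIT; SET unique_checks=1; SET foreign_key_checks=1;") > db_fast.sql

and then import db_fast.sql instead of db.sql.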

Found the problem: there isn't enough disk space when importing the database. The import logs take up a lot of space.

root@site:/var/lib/docker/volumes# pwd
/var/lib/docker/volumes
root@site:/var/lib/docker/volumes# du -shc *
7.2G global-db_db_data
7.0G global-db_db_logs

How can I disable logging during import?
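If it is the binary logs that are filling that volume (an assumption on my part, based on the db_logs size above), you can check and then purge them from a MySQL/MariaDB session inside the database container, something like:

mysql -uroot -p -e "SHOW BINARY LOGS;"
mysql -uroot -p -e "PURGE BINARY LOGS BEFORE NOW();"

That only reclaims space already used; it does not stop new binary logs from being written during the import.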

Update. Here is the answer: MariaDB logs filled up my entire disk space!

I found this answer.

https://geert.vanderkelen.org/2009/disabling-binary-logging-when-restoring-a-mysql-dump/

$ (echo "SET SESSION SQL_LOG_BIN=0;"; cat dump.sql) > dump_nobinlog.sql

A better solution would have been the following, taken from the comments on the above site. But as the volume of data is rather large, I don't want to spend another few hours waiting for the data to be exported again. This also does not include compressing the file at dump time.

$ echo "SET SESSION SQL_LOG_BIN=0;" > dumpfile   
$ mysqldump .... >> dumpfile
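For what it's worth, the same idea can be combined with compression at dump time; a sketch (fill in your own mysqldump options in place of ....):

$ (echo "SET SESSION SQL_LOG_BIN=0;"; mysqldump ....) | gzip > dumpfile.sql.gz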

I have adapted it as follows.

echo "SET SESSION SQL_LOG_BIN=0;" | gzip | zcat - /somewhere/withspace/dump/somelargedb.sql.gz | mysql -u root -p somelargedb