#dumpbucket
Explore tagged Tumblr posts
nawnlynyer · 4 years ago
Text
Tumblr media
Feel lighter...with our new Extra Heavy Duty Dump Buckets!
0 notes
chal-lelerc · 5 years ago
Text
Ah yes how fitting that Hermit Meerkat isn’t wearing the mask properly as he has a chat with Maxy
23 notes
globalmediacampaign · 5 years ago
Text
How to Migrate an On-Premises MySQL Enterprise Database to a MySQL Enterprise Database on an OCI Compute Instance
In this post, we will walk through the steps needed to migrate a particular database (in this example, the Sales database) from a local on-premises instance to an Oracle Cloud Infrastructure (OCI) Compute Instance. We will use two new features introduced with the latest release, MySQL Shell 8.0.21:

1. Dump Schema Utility — takes the backup from the on-premises database and exports it to Oracle Cloud Object Storage.

2. Load Dump Utility — imports the schema from Object Storage into the MySQL instance on the compute instance.

More info: https://dev.mysql.com/doc/mysql-shell/8.0/en/mysql-shell-utilities-dump-instance-schema.html

How does the migration work?

Suppose you want to lift-and-shift a database called "sales". The Dump Schema utility exports the sales database from on-premises to OCI Object Storage, and the Load Dump utility then imports it directly into the MySQL instance running on the OCI compute instance. (The original post includes a diagram illustrating this flow.)

What you need handy:

1. MySQL Shell 8.0.21.

2. The on-premises MySQL instance up and running.

3. The OCI compute instance up and running.

4. The OCI CLI installed on the on-premises machine.

5. The OCI CLI installed on the OCI compute instance.

6. The local_infile variable set to ON on the destination machine.

Additional details

On-premises instance: database name Sales, IP 192.168.1.10, user root, port 3306.

Oracle Cloud compute instance: public IP address 140.238.227.33, MySQL username admin, MySQL port 3306, MySQL host 127.0.0.1.

Command to export the backup from on-premises to OCI Object Storage:

MySQL  localhost:33060+ ssl  Py > util.dump_schemas(["sales"], "worlddump", {"osBucketName": "BootcampBucket", "osNamespace": "idazzjlcjqzj", "ocimds": "true", "ociConfigFile": "/root/.oci/config", "compatibility": ["strip_definers", "strip_restricted_grants"]})
Checking for compatibility with MySQL Database Service 8.0.21
NOTE: Database sales had unsupported ENCRYPTION option commented out
Compatibility issues with MySQL Database Service 8.0.21 were found and repaired. Please review the changes made before loading them.
Acquiring global read lock
All transactions have been started
Locking instance for backup
Global read lock has been released
Writing global DDL files
Preparing data dump for table `sales`.`employee`
Writing DDL for schema `sales`
Writing DDL for table `sales`.`employee`
Data dump for table `sales`.`employee` will be chunked using column `empid`
Running data dump using 4 threads.
NOTE: Progress information uses estimated values and may not be accurate.
Data dump for table `sales`.`employee` will be written to 1 file
1 thds dumping - 100% (2 rows / ~2 rows), 0.00 rows/s, 12.00 B/s uncompressed, 0.00 B/s compressed
Duration: 00:00:03s
Schemas dumped: 1
Tables dumped: 1
Uncompressed data size: 39 bytes
Compressed data size: 0 bytes
Compression ratio: 39.0
Rows written: 2
Bytes written: 0 bytes
Average uncompressed throughput: 10.61 B/s
Average compressed throughput: 0.00 B/s

Import the dump file into the compute instance from OCI Object Storage (the command below is MySQL Shell JS-mode syntax):

util.loadDump("worlddump", {threads: 8, osBucketName: "BootcampBucket", osNamespace: "idazzjlcjqzj", "ociConfigFile": "/root/.oci/config"})

Verify the results

Compare the sales schema on the on-premises database with what was loaded on the OCI compute instance; a minimal check is sketched below.
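A minimal verification sketch, run against the destination MySQL instance on the OCI compute host while connected as the admin user listed above; the schema and table names come from the dump output, and the expected row count of 2 matches the "Rows written: 2" line:

-- On the destination (e.g. mysql -h 127.0.0.1 -P 3306 -u admin -p):
SHOW DATABASES LIKE 'sales';                      -- the migrated schema should be listed
SHOW TABLES FROM sales;                           -- should include `employee`
SELECT COUNT(*) AS row_count FROM sales.employee; -- should match the source (2 rows in this example)
-- Reminder: local_infile must already be ON on this instance (prerequisite 6) for util.loadDump() to succeed:
-- SET GLOBAL local_infile = ON;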
Conclusion: the migration completed successfully!

MySQL Shell 8.0.21 makes MySQL easier to use by providing an interactive MySQL client supporting SQL, Document Store, JavaScript, and Python interfaces, with support for writing custom extensions. And with dumpInstance(), dumpSchemas(), importTable(), and loadDump(), MySQL Shell now provides powerful logical dump and load functionality.

=========== Rough Notes ===========

So what challenges came up during this migration, what errors occurred, and, most importantly, how were they fixed? Let's look at the errors one by one.

Error #01:

Traceback (most recent call last):
  File "", line 1, in
SystemError: RuntimeError: Util.dump_schemas: Cannot open file: /root/.oci/config.

Fix: install the OCI CLI on your local machine (installation document available on D:ChandanCloudOCI_Notes):
https://docs.cloud.oracle.com/en-us/iaas/Content/API/SDKDocs/cliinstall.htm
A minimal install-and-configure sketch follows.
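This sketch assumes a Linux host and the standard OCI CLI quick-start installer from the page linked above; the OCIDs, fingerprint, region, and key path shown in the sample config are placeholders, not values taken from this migration:

# Install the OCI CLI using the quick-start installer
bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

# Create /root/.oci/config interactively (prompts for user/tenancy OCIDs, region, and an API signing key)
oci setup config

# The resulting /root/.oci/config looks roughly like this (placeholder values):
# [DEFAULT]
# user=ocid1.user.oc1..<your-user-ocid>
# fingerprint=<api-key-fingerprint>
# tenancy=ocid1.tenancy.oc1..<your-tenancy-ocid>
# region=<your-region>
# key_file=/root/.oci/oci_api_key.pem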
Error #02:

MySQL  localhost:33060+ ssl  Py > util.dump_schemas(["sales"], "salesdump", {"osBucketName": "BootcampBucket", "osNamespace": "idazzjlcjqzj", "ocimds": "true", "compatibility": ["strip_definers", "strip_restricted_grants"]})
Traceback (most recent call last):
  File "", line 1, in
SystemError: RuntimeError: Util.dump_schemas: Failed to get object list using prefix 'salesdump/': The required information to complete authentication was not provided. (401)

Fix:

util.dump_schemas(["sales"], "worlddump", {"osBucketName": "dumpbucket-2", "osNamespace": "idazzjlcjqzj", "ocimds": "true", "compatibility": ["strip_definers", "strip_restricted_grants"]})

Note: don't change the name of worlddump.

Error #03:

MySQL  localhost:33060+ ssl  Py > util.dump_schemas(["sales"], "worlddump", {"osBucketName": "BootcampBucket", "osNamespace": "idazzjlcjqzj", "ocimds": "true", "ociConfigFile": "/root/.oci/config", "compatibility": ["strip_definers", "strip_restricted_grants"]})
Traceback (most recent call last):
  File "", line 1, in
SystemError: RuntimeError: Util.dump_schemas: Failed to list multipart uploads: Either the bucket named 'BootcampBucket' does not exist in the namespace 'idazzjlcjqzj' or you are not authorized to access it (404)

Fix: make sure you have set the following policies:

Allow group chandangroup to read buckets in compartment chandankumar-sandbox
Allow group chandangroup to manage objects in compartment chandankumar-sandbox where any {request.permission='OBJECT_CREATE', request.permission='OBJECT_INSPECT'}
Allow group chandangroup to read objects in compartment chandankumar-sandbox

Error #04:

MySQL  localhost:33060+ ssl  JS > util.dumpSchemas(["sales"], "worlddump", {"osBucketName": "BootcampBucket", "osNamespace": "idazzjlcjqzj", "ocimds": "true", "ociConfigFile": "/root/.oci/config", "compatibility": ["strip_definers", "strip_restricted_grants"]})
Checking for compatibility with MySQL Database Service 8.0.21
NOTE: Database sales had unsupported ENCRYPTION option commented out
Compatibility issues with MySQL Database Service 8.0.21 were found and repaired. Please review the changes made before loading them.
Acquiring global read lock
All transactions have been started
Locking instance for backup
Global read lock has been released
Writing global DDL files
Preparing data dump for table `sales`.`employee`
Writing DDL for schema `sales`
Writing DDL for table `sales`.`employee`
WARNING: Could not select a column to be used as an index for table `sales`.`employee`. Chunking has been disabled for this table, data will be dumped to a single file.
Running data dump using 4 threads.
NOTE: Progress information uses estimated values and may not be accurate.
Data dump for table `sales`.`employee` will be written to 1 file
ERROR: [Worker001]: Failed to rename object 'worlddump/[email protected]' to 'worlddump/[email protected]': Either the bucket named 'BootcampBucket' does not exist in the namespace 'idazzjlcjqzj' or you are not authorized to access it (404)
Util.dumpSchemas: Fatal error during dump (RuntimeError)

Fix: make sure you have granted rename permissions/policies on objects:

Allow group chandangroup to manage objects in compartment chandankumar-sandbox where any {request.permission='OBJECT_CREATE', request.permission='OBJECT_INSPECT', request.permission='OBJECT_OVERWRITE', request.permission='OBJECT_DELETE'}

Final policies: in the OCI console, go to Identity --> Policies --> add Policy.

Thank you for using MySQL! Please test this and share your feedback on the blog:
https://mysqlsolutionsarchitect.blogspot.com/2020/08/how-to-migrate-on-premises-mysql.html
0 notes
martyniiii · 5 years ago
Text
MAG Duskhollow PD - Part 2
New Magnus episode! MAG Duskhollow PD - Part 2
Join Jonny, Frank, Sasha, Alex and Lowri as they play MacGuffin & Co's original TTRPG 'Duskhollow PD'
This second episode sees the team split their resources to follow up the leads they've found. Emma and Archie head to 'The Last Drop' while Gabrielle Schreiber and Detective Sergeant Carl Husselback pay Angela Grockle a visit.
Thanks to this week's Patrons: Sofia Sandino, Cecil Goodwin, Nell, Masha Weber, James Candalino, Kit Buss, Rocket Corgi, Terrence Rollin' with a Grand, Connor Adderley, Rowan Grey, Liz MacGregor, Ealasaid Haas, Hhhh, ex libris crow, Ryan Norrie, Ivanna A. Villamil, Dumpbucket, o55k, Breeanna Sveum, & Sam
If you would like to join them, be sure to visit www.patreon.com/rustyquill
Edited this week by Michelle O'Toole, Tessa Vroom & Alexander J Newall.
In this episode, we used The Cthulhu Hack, created by Paul Baldowski. To find out more or to support Paul, head to:
cthulhuhack.com
patreon.com/cthulhuhack
If you'd like to learn more about or support MacGuffin & Co., head to
https://www.macguffinandcompany.com/
https://www.patreon.com/macguffinandcompany
Check out our merchandise at https://www.redbubble.com/people/rustyquill/collections/708982-the-magnus-archives-s1
You can subscribe to this podcast using your podcast software of choice, or by visiting www.rustyquill.com/subscribe
Please rate and review on your software of choice; it really helps us spread the podcast to new listeners, so share the fear.
Join our community:
WEBSITE: rustyquill.com
FACEBOOK: facebook.com/therustyquill
TWITTER: @therustyquill
REDDIT: reddit.com/r/RustyQuill
DISCORD: https://discord.gg/KckTv8y
The Magnus Archives is a podcast distributed by Rusty Quill Ltd. and licensed under a Creative Commons Attribution Non-Commercial Sharealike 4.0 International Licence
For information regarding your data privacy, visit acast.com/privacy https://ift.tt/2SpoGTH
0 notes
sarahdahlinger · 11 years ago
Link
New work/experiments...
1 note
dirtyelbows · 11 years ago
Photo
Tumblr media
Experimenting with a new structure/projection surface. This was intended to be for my Dump Bucket Aquarium, in the vein of what will become my thesis work next year, but for now it's back to the drawing board.
2 notes