#lib and aurora interacts
Text
HERE ARE THE AURORA BDAY DOODLES !

She's doing the ✨ eyes ✨
And she got a birthday hat !

Aurora is letting everyone know it's her bday !
Lib by @soul100, forest by @julia-jck (pls don't reblog this T.T), geno by loverofpiggies (I think?) and idk who created Ccino sry IwI''
Lil small comic:


Transcript (or whatever it's called):
Luminary (aurora): ''when is OUR birthday then ?''
Luminary (Lib): ''we're not even really an aurora.''
(since they are aurora and lib at the same time)
Obscure: why so excited ?
(She's emo /j) (I mean, she understands she'd be excited, but why THAT excited)
Dream!aurora (I think that's what she's called?): ????? (She's confused)
These aurora variants were made by @soul100 !!

AND SILLY EVIL CUTIE AURORA WKAVDHSG
Gonna post the rest in a bit ! :p
(btw I didn't have the ref for any of them T.T)
#art#my art#aurora sans#ut au#utmv#undertalleau#aurora!sans#aurora#aurora's bday#aurora's tale#obscure aurora sans#obscure aurora!sans#obscure aurora#obscure!aurora#dream!aurora#dream!aurora sans#luminary sans#lib and aurora interacts#aurora and lib fusion#luminary#ccino sans#ccino#geno sans#geno#forest!sans#forest#forest sans#lib sans#library!sans#library sans
38 notes
Text



@aurora-starlight-silly I found some doodles of the two :3
15 notes
Note
Hi magesmith
How well do your characters interact with children? Rank them please.
hi hello i have. sosososososo many characters for whom ive thought about this
10/10 child interactions, no notes: Ember, Dusk, Typha, Emerald, Veratrum, Isa, Annie, Dian, Mags, Aurora, Ivan, [REDACTED from Whispers x2]
7/10 could improve but not bad: Nimbus, Autumn, Arthur, Dakarsa, Marika, the Shadow, Andy, Arkady, Rosarian
5/10 mid at BEST: Gab, Beta, Jett, Izak, Xavi, Tieling, Penn, Iggy, Lakia, Dawn, Citylady and -lord Palm
2/10 pretty bad but not maliciously: Xiv, Noah, Kim, Adaine, [REDACTED from Whispers x1]
0/10 actively malicious/abusive: Actaea, [REDACTED from Whispers x2], the old Highpriestesses Delta, Lib, and Venn, and Citylord Polaris
#a&a#talking to: sleepy#many of these are actually confirmed on screen too#including the Shadow being honestly really really good with kids#just yknow. scary. using them as their parents' or siblings' collateral.#but in the way where the kids would be taken and live long happy lives never seeing those people again#i like thinking about this as a character thing bc like. im one of those people who's probably a solid 7/10#and i like thinking about what makes a difference in that score#and how that reflects in the characters in general#q
2 notes
Text
AWS CDK database queries in PostgreSQL and MySQL

With support for the AWS Cloud Development Kit (AWS CDK), you are now able to connect to and query your existing MySQL and PostgreSQL databases. This new feature lets you build a secure, real-time GraphQL API for your relational database, whether it is hosted inside or outside of Amazon Web Services (AWS). With merely your database endpoint and login credentials, you can now generate the full API for all relational database operations, and whenever your database schema changes, a single command applies the most recent modifications to the table schema.
With the release of AWS Amplify GraphQL Transformer version 2, announced in 2021, developers can create GraphQL-based app backends that are more feature-rich, adaptable, and extensible, with little to no prior cloud experience. This new GraphQL Transformer was completely redesigned to create extensible pipeline resolvers that route GraphQL API requests, apply business logic such as authorization, and interact with the underlying data source, such as Amazon DynamoDB.
But in addition to Amazon DynamoDB, users also wanted to use relational database sources for their GraphQL APIs, including their Amazon RDS or Amazon Aurora databases. Amplify GraphQL APIs now support @model types for both relational and DynamoDB data sources. Models generated from relational databases are written to a separate file called schema.sql.graphql, while you can still build and maintain DynamoDB-backed types in the standard schema.graphql file.
Given the connection information for any MySQL or PostgreSQL database, whether it is accessible publicly on the internet or through a virtual private cloud (VPC), AWS Amplify automatically generates a modifiable GraphQL API that securely connects to your database tables and exposes create, read, update, and delete (CRUD) queries and mutations. To make your data models more frontend-friendly, you can also rename them. For instance, a database table named "todos" (plural, lowercase) can be exposed to the client as "ToDo" (singular, PascalCase).
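The table-to-model renaming described above is essentially a plural/lowercase to singular/PascalCase mapping. As a toy illustration only (this is not Amplify's actual implementation, and Amplify lets you pick the exposed name explicitly; the function and its naming rules here are assumptions), the idea can be sketched as:

```python
def to_model_name(table_name: str) -> str:
    # Hypothetical sketch: split on underscores, strip a trailing "s"
    # from the last word, and PascalCase the parts.
    parts = table_name.split("_")
    if parts[-1].endswith("s") and len(parts[-1]) > 1:
        parts[-1] = parts[-1][:-1]
    return "".join(p.capitalize() for p in parts)

print(to_model_name("todos"))               # -> Todo
print(to_model_name("restaurant_reviews"))  # -> RestaurantReview
```

Real naming rules need irregular-plural handling, which is why Amplify exposes the mapping as a user choice rather than guessing.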
Any of the existing Amplify GraphQL authorization rules can be added to your API with a single line of code, enabling the smooth development of use cases such as owner-based authorization and public read-only patterns. Because the generated API is built on AWS AppSync's GraphQL capabilities, secure real-time subscriptions are available out of the box: with a few lines of code, you can subscribe to any CRUD event from any data model.
Starting up the MySQL database in the AWS CDK
The AWS CDK gives you the significant expressive capability of a programming language to create dependable, scalable, and affordable cloud applications. Install the AWS CDK on your local computer to begin.
To print the AWS CDK version number and confirm that the installation is correct, use the following command:
$ cdk --version
Next, make a new directory for your app and use the cdk init command to set up a CDK application inside it.
Add the Amplify GraphQL API construct to the newly created CDK project.
Open your CDK project's primary stack file, which is usually found in lib/<your-project-name>-stack.ts, and add the required imports to the top of the file.
To create a GraphQL schema for a new relational database API, run a SQL query against your MySQL database that describes its schema. Make sure the results are written to a .csv file with column headers included, and replace <database-name> with the name of your database or schema.
Then run the following command, substituting the path to the .csv file prepared in the previous step for <path-to-schema.csv>:
$ npx @aws-amplify/cli api generate-schema \
    --sql-schema <path-to-schema.csv> \
    --engine-type mysql --out lib/schema.sql.graphql
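The schema-export query itself is not reproduced in this post. As a hypothetical sketch (the exact query recommended by the Amplify documentation may differ), an information_schema query along these lines, exported to a .csv file with column headers, carries the column metadata the CLI needs:

```sql
-- Hypothetical sketch: describe every column of <database-name>
-- so the Amplify CLI can infer GraphQL types from the result.
SELECT TABLE_NAME, COLUMN_NAME, COLUMN_DEFAULT, ORDINAL_POSITION,
       DATA_TYPE, COLUMN_TYPE, IS_NULLABLE, CHARACTER_MAXIMUM_LENGTH,
       COLUMN_KEY
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = '<database-name>'
ORDER BY TABLE_NAME, ORDINAL_POSITION;
```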
To view the imported data model from your MySQL database schema, open the schema.sql.graphql file.
If you haven’t already, establish a parameter for your database’s connection information, including hostname/url, database name, port, username, and password, in the AWS Systems Manager console’s Parameter Store. To properly connect to your database and run GraphQL queries or modifications against it, Amplify will need these in the following step.
To define a new GraphQL API, add the following code to the main stack class, replacing the dbConnectionConfig options with the parameter paths created in the previous step.
This setup assumes that your database is accessible over the internet. Additionally, sandbox mode is enabled on all models to permit public access, and the default authorization mode is set to API key for AWS AppSync. You can use this to test your API before implementing more fine-grained authorization rules.
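The stack code referred to above is not included in the post. Assuming the @aws-amplify/graphql-api-construct package, a sketch of the API definition might look roughly like the following; the option names and SSM parameter paths shown here are illustrative assumptions based on the construct's documented surface, not the post's actual code:

```typescript
import * as path from 'path';
import { Duration } from 'aws-cdk-lib';
import {
  AmplifyGraphqlApi,
  AmplifyGraphqlDefinition,
} from '@aws-amplify/graphql-api-construct';

// Inside the stack class constructor:
new AmplifyGraphqlApi(this, 'MySqlBackedApi', {
  apiName: 'MySQLApi',
  definition: AmplifyGraphqlDefinition.fromFilesAndStrategy(
    // The schema generated earlier by the Amplify CLI.
    [path.join(__dirname, 'schema.sql.graphql')],
    {
      name: 'MySQLSchemaStrategy',
      dbType: 'MYSQL',
      // Hypothetical Parameter Store paths; use the ones you created.
      dbConnectionConfig: {
        hostnameSsmPath: '/amplify/sqlhostname',
        portSsmPath: '/amplify/sqlport',
        databaseNameSsmPath: '/amplify/sqldatabasename',
        usernameSsmPath: '/amplify/sqlusername',
        passwordSsmPath: '/amplify/sqlpassword',
      },
    },
  ),
  authorizationModes: {
    defaultAuthorizationMode: 'API_KEY',
    apiKeyConfig: { expires: Duration.days(30) },
  },
  // Matches the sandbox-mode behavior described above.
  translationBehavior: { sandboxModeEnabled: true },
});
```

The SSM paths must match the Parameter Store entries created in the previous step; cdk deploy then provisions the AppSync API.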
Lastly, deploy your GraphQL API to the AWS Cloud
In the AWS AppSync console, select your project and then the Queries menu. The newly generated GraphQL operations, such as getMeals to retrieve a single item or listRestaurants to list all records, work against your MySQL database tables.
For instance, when you pick objects that have fields like address, city, name, phone number, and so on, a new GraphQL query appears. Select the Run button to view the query results from your MySQL database.
Running the same query directly against your MySQL database returns identical results.
Currently accessible
The relational database support in AWS Amplify now works with any MySQL or PostgreSQL database hosted anywhere: within an Amazon VPC or even outside of the AWS Cloud.
Read more on Govindhtech.com
#aws#mysql#postgresql#api#GraphQLAPI#database#CDK#VPC#cloudcomputing#technology#technews#govindhtech
0 notes
Text
#WDILTW – Creating examples can be hard
This week I was evaluating AWS QLDB, specifically the verifiable history of changes, to determine how to simplify present processes that perform auditing via CDC. This is not the first time I have looked at QLDB, so there was nothing that new to learn. What I found was that creating a workable solution with an existing application is hard. Even harder is creating an example to publish in this blog (and that is the purpose of this post).

First, some background. Using MySQL as the source of information, how can you leverage QLDB? It's easy to stream data from MySQL Aurora, and it's easy to stream data from QLDB, but it's not that easy to place real-time data into QLDB. AWS DMS is a good way to move data from a source to a target; my previous work has included MySQL to MySQL, MySQL to Redshift, and MySQL to Kinesis, however there is no QLDB target.

Turning the problem upside down, using QLDB as the source of information and streaming to MySQL for compatibility seemed a way forward. After setting up the QLDB ledger and an example table, it was time to populate it with existing data. The documented reference example looked very JSON compatible. Side bar: it is actually Amazon Ion, a superset of JSON.

INSERT INTO Person

Now, MySQL offers the X Protocol. This is something that lefred has evangelized for many years and I have seen presented many times, but finally I had a chance to use it. The MySQL Shell JSON output looked ideal.

{ "ID": 1523, "Name": "Wien", "CountryCode": "AUT", "District": "Wien", "Info": { "Population": 1608144 } }
{ "ID": 1524, "Name": "Graz", "CountryCode": "AUT", "District": "Steiermark", "Info": { "Population": 240967 } }

And now, onto some of the things I learned this week. Using AWS RDS Aurora MySQL was the first stumbling block: X Protocol is not supported. As this was an example, keep it simple: mysqldump some reference data, load it into a MySQL 8 instance, and extract it into JSON, so as to potentially emulate a pipeline.
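Each document in that MySQL Shell output is one self-contained JSON object per line, so a consumer needs nothing more than the standard json module. A minimal sketch using the two rows above:

```python
import json

# One document per line, exactly as MySQL Shell's JSON output prints it.
shell_output = '''
{"ID": 1523, "Name": "Wien", "CountryCode": "AUT", "District": "Wien", "Info": {"Population": 1608144}}
{"ID": 1524, "Name": "Graz", "CountryCode": "AUT", "District": "Steiermark", "Info": {"Population": 240967}}
'''

# Parse each non-empty line into a dict.
docs = [json.loads(line) for line in shell_output.strip().splitlines()]
print(len(docs))                      # -> 2
print(docs[0]["Info"]["Population"])  # -> 1608144
```

Since Amazon Ion is a superset of JSON, documents in this shape can be handed to the QLDB driver largely unchanged.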
Here are my experiences of trying to refactor this into a demo to write up.

Launch a MySQL Docker container as per my standard notes. Harmless, right?

MYSQL_ROOT_PASSWORD="$(date | md5sum | cut -c1-20)#"
echo $MYSQL_ROOT_PASSWORD
docker run --name=qldb-mysql -p3306:3306 -v mysql-volume:/var/lib/mysql -e MYSQL_ROOT_PASSWORD=$MYSQL_ROOT_PASSWORD -d mysql/mysql-server:latest
docker logs qldb-mysql
docker exec -it qldb-mysql /bin/bash

As it's a quick demo, I shortcut credentials to make using the mysql client easier. NOTE: as I always generate a new password for each container, it's fine to include it here.

# echo "[mysql]
user=root
password='ab6ea7b0436cbc0c0d49#'" > .my.cnf
# mysql
ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: NO)

What the? Did I make a mistake? I test manually and check.

# mysql -u root -p
# cat .my.cnf

Nothing wrong there. Next check:

# pwd
/
bash-4.2# grep root /etc/passwd
root:x:0:0:root:/root:/bin/bash
operator:x:11:0:operator:/root:/sbin/nologin

And there is the first Dockerism. I don't live in Docker, so these 101 learnings would otherwise be known. First, I really think using "root" by default is a horrible idea. And when you shell in, you are not dropped into the home directory, so the config file was written to / and never read. Solved, we move on.

# mv /.my.cnf /root/.my.cnf

Mock up an example as quickly as I can think.
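As an aside, the date | md5sum trick above is what bites later: the trailing # is an illegal character in a mysqlx connection URI. A throwaway password without URI-hostile characters can be generated just as easily; this sketch is a suggested alternative, not what the original setup used:

```python
import secrets
import string

# Alphanumeric only, so the password is safe to embed in a
# mysqlx connection URI such as user:password@localhost.
alphabet = string.ascii_letters + string.digits
password = "".join(secrets.choice(alphabet) for _ in range(20))

print(len(password))        # -> 20
print(password.isalnum())   # -> True
```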
# mysql
mysql> create schema if not exists demo;
Query OK, 1 row affected (0.00 sec)

mysql> use demo;
Database changed

mysql> create table sample(id int unsigned not null auto_increment, name varchar(30) not null, location varchar(30) not null, domain varchar(50) null, primary key(id));
Query OK, 0 rows affected (0.03 sec)

mysql> show create table sample;

mysql> insert into sample values (null,'Demo Row','USA',null), (null,'Row 2','AUS','news.com.au'), (null,'Kiwi','NZ', null);
Query OK, 3 rows affected (0.00 sec)
Records: 3  Duplicates: 0  Warnings: 0

mysql> select * from sample;
+----+----------+----------+-------------+
| id | name     | location | domain      |
+----+----------+----------+-------------+
|  1 | Demo Row | USA      | NULL        |
|  2 | Row 2    | AUS      | news.com.au |
|  3 | Kiwi     | NZ       | NULL        |
+----+----------+----------+-------------+
3 rows in set (0.00 sec)

Cool, now to look at it in JavaScript using MySQL Shell. Hurdle 2.

# mysqlsh
MySQL Shell 8.0.22
Copyright (c) 2016, 2020, Oracle and/or its affiliates.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.

MySQL JS > var session=mysqlx.getSession('root:ab6ea7b0436cbc0c0d49#@localhost')
mysqlx.getSession: Argument #1: Invalid URI: Illegal character [#] found at position 25 (ArgumentError)

What the? It doesn't like the password format. I'm not a JavaScript person, and this is an example for blogging, which is not what was actually set up, so do it the right way: create a user.
# mysql
mysql> create user demo@localhost identified by 'qldb';
Query OK, 0 rows affected (0.01 sec)

mysql> grant ALL ON sample.* to demo@localhost;
Query OK, 0 rows affected, 1 warning (0.01 sec)

mysql> SHOW GRANTS FOR demo@localhost;
+----------------------------------------------------------+
| Grants for demo@localhost                                |
+----------------------------------------------------------+
| GRANT USAGE ON *.* TO `demo`@`localhost`                 |
| GRANT ALL PRIVILEGES ON `sample`.* TO `demo`@`localhost` |
+----------------------------------------------------------+
2 rows in set (0.00 sec)

Back into the MySQL Shell, and hurdle 3.

MySQL JS > var session=mysqlx.getSession('demo:qldb@localhost')
mysqlx.getSession: Access denied for user 'demo'@'127.0.0.1' (using password: YES) (MySQL Error 1045)

Did I create the creds wrong? Verify. No, my password is correct.

# mysql -udemo -pqldb -e "SELECT NOW()"
mysql: [Warning] Using a password on the command line interface can be insecure.
+---------------------+
| NOW()               |
+---------------------+
| 2021-03-06 23:15:26 |
+---------------------+

I don't have time to debug this. User, take 2.
mysql> drop user demo@localhost;
Query OK, 0 rows affected (0.00 sec)

mysql> create user demo@'%' identified by 'qldb';
Query OK, 0 rows affected (0.01 sec)

mysql> grant all on demo.* to demo@'%';
Query OK, 0 rows affected (0.00 sec)

mysql> show grants;
+---
| Grants for root@localhost |
+---
| GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, RELOAD, SHUTDOWN, PROCESS, FILE, REFERENCES, INDEX, ALTER, SHOW DATABASES, SUPER, CREATE TEMPORARY TABLES, LOCK TABLES, EXECUTE, REPLICATION SLAVE, REPLICATION CLIENT, CREATE VIEW, SHOW VIEW, CREATE ROUTINE, ALTER ROUTINE, CREATE USER, EVENT, TRIGGER, CREATE TABLESPACE, CREATE ROLE, DROP ROLE ON *.* TO `root`@`localhost` WITH GRANT OPTION |
| GRANT APPLICATION_PASSWORD_ADMIN,AUDIT_ADMIN,BACKUP_ADMIN,BINLOG_ADMIN,BINLOG_ENCRYPTION_ADMIN,CLONE_ADMIN,CONNECTION_ADMIN,ENCRYPTION_KEY_ADMIN,FLUSH_OPTIMIZER_COSTS,FLUSH_STATUS,FLUSH_TABLES,FLUSH_USER_RESOURCES,GROUP_REPLICATION_ADMIN,INNODB_REDO_LOG_ARCHIVE,INNODB_REDO_LOG_ENABLE,PERSIST_RO_VARIABLES_ADMIN,REPLICATION_APPLIER,REPLICATION_SLAVE_ADMIN,RESOURCE_GROUP_ADMIN,RESOURCE_GROUP_USER,ROLE_ADMIN,SERVICE_CONNECTION_ADMIN,SESSION_VARIABLES_ADMIN,SET_USER_ID,SHOW_ROUTINE,SYSTEM_USER,SYSTEM_VARIABLES_ADMIN,TABLE_ENCRYPTION_ADMIN,XA_RECOVER_ADMIN ON *.* TO `root`@`localhost` WITH GRANT OPTION |
| GRANT PROXY ON ''@'' TO 'root'@'localhost' WITH GRANT OPTION |
+---
3 rows in set (0.00 sec)

mysql> show grants for demo@'%';
+--------------------------------------------------+
| Grants for demo@%                                |
+--------------------------------------------------+
| GRANT USAGE ON *.* TO `demo`@`%`                 |
| GRANT ALL PRIVILEGES ON `demo`.* TO `demo`@`%`   |
+--------------------------------------------------+
2 rows in set (0.00 sec)

Right: initially I showed the grants for root, not the new user. Note to self: I should check out the MySQL 8 improved grants. I wonder how RDS MySQL 8 handles these, and how Aurora MySQL 8 will (when it ever drops, but that's another story).
Third try is a charm, and it's nice to also see queries with 0.0000-second execution granularity.

MySQL JS > var session=mysqlx.getSession('demo:qldb@localhost')
MySQL JS > var sql='SELECT * FROM demo.sample'
MySQL JS > session.sql(sql)
+----+----------+----------+-------------+
| id | name     | location | domain      |
+----+----------+----------+-------------+
|  1 | Demo Row | USA      | NULL        |
|  2 | Row 2    | AUS      | news.com.au |
|  3 | Kiwi     | NZ       | NULL        |
+----+----------+----------+-------------+
3 rows in set (0.0006 sec)

Get that now in JSON output. NOTE: There are 3 different JSON formats; this one matched what I needed.

bash-4.2# mysqlsh
MySQL Shell 8.0.22
Copyright (c) 2016, 2020, Oracle and/or its affiliates.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.
Type 'help' or '?' for help; 'quit' to exit.
MySQL JS > var session=mysqlx.getSession('demo:qldb@localhost')
MySQL JS > var sql='SELECT * FROM demo.sample'
MySQL JS > shell.options.set('resultFormat','json/array')
MySQL JS > session.sql(sql)
[
{"id":1,"name":"Demo Row","location":"USA","domain":null},
{"id":2,"name":"Row 2","location":"AUS","domain":"news.com.au"},
{"id":3,"name":"Kiwi","location":"NZ","domain":null}
]
3 rows in set (0.0006 sec)

Ok, that works in the interactive interface; I need it scripted.

# vi
bash: vi: command not found
# yum install vi
Loaded plugins: ovl
http://repo.mysql.com/yum/mysql-connectors-community/el/7/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
...

And another downer of Docker containers: no other tools, or easy ways to install them. Again, I want to focus on the actual example and not all this preamble, so:

# echo "var session=mysqlx.getSession('demo:qldb@localhost')
var sql='SELECT * FROM demo.sample'
shell.options.set('resultFormat','json/array')
session.sql(sql)" > dump.js
# mysqlsh

What the? Hurdle 4.
Did I typo this as well? I check the file, and cut/paste it, and get what I expect.

# cat dump.js
var session=mysqlx.getSession('demo:qldb@localhost')
var sql='SELECT * FROM demo.sample'
shell.options.set('resultFormat','json/array')
session.sql(sql)
# mysqlsh
MySQL Shell 8.0.22
Copyright (c) 2016, 2020, Oracle and/or its affiliates.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.
Type 'help' or '?' for help; 'quit' to exit.
MySQL JS > var session=mysqlx.getSession('demo:qldb@localhost')
MySQL JS > var sql='SELECT * FROM demo.sample'
MySQL JS > shell.options.set('resultFormat','json/array')
MySQL JS > session.sql(sql)
[
{"id":1,"name":"Demo Row","location":"USA","domain":null},
{"id":2,"name":"Row 2","location":"AUS","domain":"news.com.au"},
{"id":3,"name":"Kiwi","location":"NZ","domain":null}
]
3 rows in set (0.0022 sec)

This is getting crazy.

# echo '[
> {"id":1,"name":"Demo Row","location":"USA","domain":null},
> {"id":2,"name":"Row 2","location":"AUS","domain":"news.com.au"},
> {"id":3,"name":"Kiwi","location":"NZ","domain":null}
> ]' > sample.json
bash-4.2# jq . sample.json
bash: jq: command not found

Oh, the Docker!!!! Switching back to my EC2 instance now.

$ echo '[
> {"id":1,"name":"Demo Row","location":"USA","domain":null},
> {"id":2,"name":"Row 2","location":"AUS","domain":"news.com.au"},
> {"id":3,"name":"Kiwi","location":"NZ","domain":null}
> ]' > sample.json
$ jq . sample.json
[
  {
    "id": 1,
    "name": "Demo Row",
    "location": "USA",
    "domain": null
  },
  {
    "id": 2,
    "name": "Row 2",
    "location": "AUS",
    "domain": "news.com.au"
  },
  {
    "id": 3,
    "name": "Kiwi",
    "location": "NZ",
    "domain": null
  }
]

I am now way over the time I would like to spend on this weekly post, it's getting way too long, and I'm nowhere near showing what I actually want. Still, we trek on. Boy, this stock EC2 image uses version 1 of the AWS CLI; I'm sure we need v2, and well, the command does not work!!!!
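For the record, the jq pretty-printing step can also be done with Python's standard json module, which avoids installing anything extra; a minimal sketch assuming the sample.json contents shown above:

```python
import json

# The same three rows written to sample.json above.
sample = '''[
{"id":1,"name":"Demo Row","location":"USA","domain":null},
{"id":2,"name":"Row 2","location":"AUS","domain":"news.com.au"},
{"id":3,"name":"Kiwi","location":"NZ","domain":null}
]'''

rows = json.loads(sample)
pretty = json.dumps(rows, indent=2)  # jq-style pretty printing
print(pretty)
print(len(rows))  # -> 3
```

From the command line, python -m json.tool sample.json produces similar pretty-printed output.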
$ aws qldb list-ledgers
ERROR:
$ aws --version
$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
$ unzip awscliv2.zip
$ sudo ./aws/install
$ export PATH=/usr/local/bin:$PATH
$ aws --version

Can I finally get a ledger now?

$ aws qldb create-ledger --name demo --tags JIRA=DEMO-5826,Owner=RonaldBradford --permissions-mode ALLOW_ALL --no-deletion-protection
{
    "Name": "demo",
    "Arn": "arn:aws:qldb:us-east-1:999:ledger/demo",
    "State": "CREATING",
    "CreationDateTime": "2021-03-06T22:46:41.760000+00:00",
    "DeletionProtection": false
}

$ aws qldb list-ledgers
{
    "Ledgers": [
        {
            "Name": "xx",
            "State": "ACTIVE",
            "CreationDateTime": "2021-03-05T20:12:44.611000+00:00"
        },
        {
            "Name": "demo",
            "State": "ACTIVE",
            "CreationDateTime": "2021-03-06T22:46:41.760000+00:00"
        }
    ]
}

$ aws qldb describe-ledger --name demo
{
    "Name": "demo",
    "Arn": "arn:aws:qldb:us-east-1:999:ledger/demo",
    "State": "ACTIVE",
    "CreationDateTime": "2021-03-06T22:46:41.760000+00:00",
    "DeletionProtection": false
}

Oh, the Python 2, and the lack of user packaging; more crud in the way of getting an example.

$ pip install pyqldb==3.1.0
ERROR
$ echo "alias python=python3
alias pip=pip3" >> ~/.bash_profile
source ~/.bash_profile
$ pip --version
pip 9.0.3 from /usr/lib/python3.6/site-packages (python 3.6)
$ python --version
Python 3.6.8
$ pip install pyqldb==3.1.0
ERROR
$ sudo pip install pyqldb==3.1.0

Yeah! After all that, my example code works and data is inserted.
$ cat demo.py
from pyqldb.config.retry_config import RetryConfig
from pyqldb.driver.qldb_driver import QldbDriver

# Configure retry limit to 3
retry_config = RetryConfig(retry_limit=3)

# Initialize the driver
print("Initializing the driver")
qldb_driver = QldbDriver("demo", retry_config=retry_config)

def create_table(transaction_executor, table):
    print("Creating table {}".format(table))
    transaction_executor.execute_statement("CREATE TABLE {}".format(table))

def create_index(transaction_executor, table, column):
    print("Creating index {}.{}".format(table, column))
    transaction_executor.execute_statement("CREATE INDEX ON {}({})".format(table, column))

def insert_record(transaction_executor, table, values):
    print("Inserting into {}".format(table))
    transaction_executor.execute_statement("INSERT INTO {} ?".format(table), values)

table = "sample"
column = "id"
qldb_driver.execute_lambda(lambda executor: create_table(executor, table))
qldb_driver.execute_lambda(lambda executor: create_index(executor, table, column))

record1 = {'id': "1", 'name': "Demo Row", 'location': "USA", 'domain': ""}
qldb_driver.execute_lambda(lambda x: insert_record(x, table, record1))

$ python demo.py
Initializing the driver
Creating table sample
Creating index sample.id
Inserting into sample

One verifies this in the AWS Console, but you cannot show that in text in this blog, so I went to find a simple client, and there is qldbshell. What the? I installed it and it complains about pyqldb.driver.pooled_qldb_driver. I literally used that package in the last example.
$ pip3 install qldbshell Collecting qldbshell Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/0f/f7/fe984d797e0882c5e141a4888709ae958eb8c48007a23e94000507439f83/qldbshell-1.2.0.tar.gz (68kB) 100% |████████████████████████████████| 71kB 55.6MB/s Requirement already satisfied: boto3>=1.9.237 in /usr/local/lib/python3.6/site-packages (from qldbshell) Collecting amazon.ion=0.5.0 (from qldbshell) Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/4e/b7/21b7a7577cc6864d1c93fd710701e4764af6cf0f7be36fae4f9673ae11fc/amazon.ion-0.5.0.tar.gz (178kB) 100% |████████████████████████████████| 184kB 78.7MB/s Requirement already satisfied: prompt_toolkit=3.0.5 in /usr/local/lib/python3.6/site-packages (from qldbshell) Requirement already satisfied: ionhash~=1.1.0 in /usr/local/lib/python3.6/site-packages (from qldbshell) Requirement already satisfied: s3transfer=0.3.0 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell) Requirement already satisfied: jmespath=0.7.1 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell) Requirement already satisfied: botocore=1.20.21 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell) Requirement already satisfied: six in /usr/local/lib/python3.6/site-packages (from amazon.ion=0.5.0->qldbshell) Requirement already satisfied: wcwidth in /usr/local/lib/python3.6/site-packages (from prompt_toolkit=3.0.5->qldbshell) Requirement already satisfied: python-dateutil=2.1 in /usr/local/lib/python3.6/site-packages (from botocore=1.20.21->boto3>=1.9.237->qldbshell) Requirement already satisfied: urllib3=1.25.4 in /usr/local/lib/python3.6/site-packages (from botocore=1.20.21->boto3>=1.9.237->qldbshell) Installing collected packages: amazon.ion, qldbshell Found existing installation: amazon.ion 0.7.0 Uninstalling amazon.ion-0.7.0: Exception: Traceback (most recent call last): File 
"/usr/lib64/python3.6/shutil.py", line 550, in move os.rename(src, real_dst) PermissionError: [Errno 13] Permission denied: '/usr/local/lib/python3.6/site-packages/amazon.ion-0.7.0-py3.6-nspkg.pth' -> '/tmp/pip-p8j4d45d-uninstall/usr/local/lib/python3.6/site-packages/amazon.ion-0.7.0-py3.6-nspkg.pth' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/site-packages/pip/basecommand.py", line 215, in main status = self.run(options, args) File "/usr/lib/python3.6/site-packages/pip/commands/install.py", line 365, in run strip_file_prefix=options.strip_file_prefix, File "/usr/lib/python3.6/site-packages/pip/req/req_set.py", line 783, in install requirement.uninstall(auto_confirm=True) File "/usr/lib/python3.6/site-packages/pip/req/req_install.py", line 754, in uninstall paths_to_remove.remove(auto_confirm) File "/usr/lib/python3.6/site-packages/pip/req/req_uninstall.py", line 115, in remove renames(path, new_path) File "/usr/lib/python3.6/site-packages/pip/utils/__init__.py", line 267, in renames shutil.move(old, new) File "/usr/lib64/python3.6/shutil.py", line 565, in move os.unlink(src) PermissionError: [Errno 13] Permission denied: '/usr/local/lib/python3.6/site-packages/amazon.ion-0.7.0-py3.6-nspkg.pth' [centos@ip-10-204-101-224] ~ $ sudo pip3 install qldbshell WARNING: Running pip install with root privileges is generally not a good idea. Try `pip3 install --user` instead. 
Collecting qldbshell Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/0f/f7/fe984d797e0882c5e141a4888709ae958eb8c48007a23e94000507439f83/qldbshell-1.2.0.tar.gz (68kB) 100% |████████████████████████████████| 71kB 49.8MB/s Requirement already satisfied: boto3>=1.9.237 in /usr/local/lib/python3.6/site-packages (from qldbshell) Collecting amazon.ion=0.5.0 (from qldbshell) Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/4e/b7/21b7a7577cc6864d1c93fd710701e4764af6cf0f7be36fae4f9673ae11fc/amazon.ion-0.5.0.tar.gz (178kB) 100% |████████████████████████████████| 184kB 27.7MB/s Requirement already satisfied: prompt_toolkit=3.0.5 in /usr/local/lib/python3.6/site-packages (from qldbshell) Requirement already satisfied: ionhash~=1.1.0 in /usr/local/lib/python3.6/site-packages (from qldbshell) Requirement already satisfied: botocore=1.20.21 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell) Requirement already satisfied: jmespath=0.7.1 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell) Requirement already satisfied: s3transfer=0.3.0 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell) Requirement already satisfied: six in /usr/local/lib/python3.6/site-packages (from amazon.ion=0.5.0->qldbshell) Requirement already satisfied: wcwidth in /usr/local/lib/python3.6/site-packages (from prompt_toolkit=3.0.5->qldbshell) Requirement already satisfied: python-dateutil=2.1 in /usr/local/lib/python3.6/site-packages (from botocore=1.20.21->boto3>=1.9.237->qldbshell) Requirement already satisfied: urllib3=1.25.4 in /usr/local/lib/python3.6/site-packages (from botocore=1.20.21->boto3>=1.9.237->qldbshell) Installing collected packages: amazon.ion, qldbshell Found existing installation: amazon.ion 0.7.0 Uninstalling amazon.ion-0.7.0: Successfully uninstalled amazon.ion-0.7.0 Running setup.py install for amazon.ion ... 
done Running setup.py install for qldbshell ... done Successfully installed amazon.ion-0.5.0 qldbshell-1.2.0 $ sudo pip3 install qldbshell $ qldbshell Traceback (most recent call last): File "/usr/local/bin/qldbshell", line 11, in load_entry_point('qldbshell==1.2.0', 'console_scripts', 'qldbshell')() File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 476, in load_entry_point return get_distribution(dist).load_entry_point(group, name) File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2700, in load_entry_point return ep.load() File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2318, in load return self.resolve() File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2324, in resolve module = __import__(self.module_name, fromlist=['__name__'], level=0) File "/usr/local/lib/python3.6/site-packages/qldbshell/__main__.py", line 25, in from pyqldb.driver.pooled_qldb_driver import PooledQldbDriver ModuleNotFoundError: No module named 'pyqldb.driver.pooled_qldb_driver' $ pip list qldbshell DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning. amazon.ion (0.5.0) boto3 (1.17.21) botocore (1.20.21) ionhash (1.1.0) jmespath (0.10.0) pip (9.0.3) prompt-toolkit (3.0.16) pyqldb (3.1.0) python-dateutil (2.8.1) qldbshell (1.2.0) s3transfer (0.3.4) setuptools (39.2.0) six (1.15.0) urllib3 (1.26.3) So, uninstalled and re-installed and voila, my data. $ qldbshell usage: qldbshell [-h] [-v] [-s QLDB_SESSION_ENDPOINT] [-r REGION] [-p PROFILE] -l LEDGER qldbshell: error: the following arguments are required: -l/--ledger $ qldbshell -l demo Welcome to the Amazon QLDB Shell version 1.2.0 Use 'start' to initiate and interact with a transaction. 'commit' and 'abort' to commit or abort a transaction. 
Use 'start; statement 1; statement 2; commit; start; statement 3; commit' to create transactions non-interactively.
Use 'help' for the help section.
All other commands will be interpreted as PartiQL statements until the 'exit' or 'quit' command is issued.

qldbshell > SELECT * FROM sample;
INFO: { id: "1", name: "Demo Row", location: "USA", domain: "" }
INFO: (0.1718s)
qldbshell > q
WARNING: Error while executing query: An error occurred (BadRequestException) when calling the SendCommand operation: Lexer Error: at line 1, column 1: invalid character at, '' [U+5c];
INFO: (0.1134s)
qldbshell > exit
Exiting QLDB Shell

Right, "q" is a mysqlism of that client; I need to rewire myself. Now that I have a ledger, have created an example table, mocked a row of data, and verified it, I can just load my sample data from the JSON I created earlier, right? Wrong!!!

$ cat load.py
import json
from pyqldb.config.retry_config import RetryConfig
from pyqldb.driver.qldb_driver import QldbDriver

# Configure retry limit to 3
retry_config = RetryConfig(retry_limit=3)

# Initialize the driver
print("Initializing the driver")
qldb_driver = QldbDriver("demo", retry_config=retry_config)

def insert_record(transaction_executor, table, values):
    print("Inserting into {}".format(table))
    transaction_executor.execute_statement("INSERT INTO {} ?".format(table), values)

table = "sample"
with open('sample.json') as f:
    data = json.load(f)
qldb_driver.execute_lambda(lambda x: insert_record(x, table, data))

$ python load.py
Traceback (most recent call last):
  File "load.py", line 2, in <module>
    from pyqldb.config.retry_config import RetryConfig
ModuleNotFoundError: No module named 'pyqldb'
[centos@ip-10-204-101-224] ~

Oh sweet, I'd installed that, and used it, and re-installed it.

$ pip list | grep pyqldb
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
[centos@ip-10-204-101-224] ~
$ sudo pip3 install pyqldb
WARNING: Running pip install with root privileges is generally not a good idea. Try `pip3 install --user` instead.
Collecting pyqldb
  Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/5c/b4/9790b1fad87d7df5b863cbf353689db145bd009d31d854d282b31e1c1781/pyqldb-3.1.0.tar.gz
Collecting amazon.ion=0.7.0 (from pyqldb)
  Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/7d/ac/fd1edee54cefa425c444b51ad00a20e5bc74263a3afbfd4c8743040f8f26/amazon.ion-0.7.0.tar.gz (211kB)
    100% |████████████████████████████████| 215kB 24.8MB/s
Requirement already satisfied: boto3=1.16.56 in /usr/local/lib/python3.6/site-packages (from pyqldb)
Requirement already satisfied: botocore=1.19.56 in /usr/local/lib/python3.6/site-packages (from pyqldb)
Requirement already satisfied: ionhash=1.1.0 in /usr/local/lib/python3.6/site-packages (from pyqldb)
Requirement already satisfied: six in /usr/local/lib/python3.6/site-packages (from amazon.ion=0.7.0->pyqldb)
Requirement already satisfied: s3transfer=0.3.0 in /usr/local/lib/python3.6/site-packages (from boto3=1.16.56->pyqldb)
Requirement already satisfied: jmespath=0.7.1 in /usr/local/lib/python3.6/site-packages (from boto3=1.16.56->pyqldb)
Requirement already satisfied: python-dateutil=2.1 in /usr/local/lib/python3.6/site-packages (from botocore=1.19.56->pyqldb)
Requirement already satisfied: urllib3=1.25.4 in /usr/local/lib/python3.6/site-packages (from botocore=1.19.56->pyqldb)
Installing collected packages: amazon.ion, pyqldb
  Found existing installation: amazon.ion 0.5.0
    Uninstalling amazon.ion-0.5.0:
      Successfully uninstalled amazon.ion-0.5.0
  Running setup.py install for amazon.ion ... done
  Running setup.py install for pyqldb ... done
Successfully installed amazon.ion-0.7.0 pyqldb-3.1.0

Load one more time.
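Reading that pip output closely shows the actual clash: installing pyqldb pulled in amazon.ion 0.7.0 and uninstalled the 0.5.0 that qldbshell had installed. A quick way to sanity-check a version against a pin with pkg_resources; note the exact version ranges below are my guesses from the transcript, not taken from either package's setup.py:

```python
# Hypothetical pins inferred from the pip transcript above: qldbshell appears
# to want amazon.ion in the 0.5.x range, while pyqldb wants 0.7.x, so a single
# site-packages directory cannot satisfy both tools at once.
from pkg_resources import Requirement

def satisfies(requirement, version):
    """True when `version` falls within the requirement's specifier set."""
    return version in Requirement.parse(requirement)

print(satisfies("amazon.ion>=0.7.0,<0.8.0", "0.7.0"))  # True: fine for pyqldb
print(satisfies("amazon.ion>=0.5.0,<0.6.0", "0.7.0"))  # False: breaks qldbshell
```

Running each tool from its own virtual environment (python3 -m venv) would avoid the tug-of-war entirely.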
$ cat load.py
import json
from pyqldb.config.retry_config import RetryConfig
from pyqldb.driver.qldb_driver import QldbDriver

# Configure retry limit to 3
retry_config = RetryConfig(retry_limit=3)

# Initialize the driver
print("Initializing the driver")
qldb_driver = QldbDriver("demo", retry_config=retry_config)

def insert_record(transaction_executor, table, values):
    print("Inserting into {}".format(table))
    transaction_executor.execute_statement("INSERT INTO {} ?".format(table), values)

table = "sample"
with open('sample.json') as f:
    data = json.load(f)

qldb_driver.execute_lambda(lambda x: insert_record(x, table, data))

$ python load.py
Initializing the driver
Inserting into sample

And done, I've got my JSON-extracted MySQL 8 data in QLDB. I go to vet it in the client, and boy, I didn't expect yet another package screw-up. Clearly, these two AWS Python packages are incompatible. That's a venv need, but I'm now at double my desired time to show this.

$ qldbshell -l demo
Traceback (most recent call last):
  File "/usr/local/bin/qldbshell", line 11, in <module>
    load_entry_point('qldbshell==1.2.0', 'console_scripts', 'qldbshell')()
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 476, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2700, in load_entry_point
    return ep.load()
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2318, in load
    return self.resolve()
  File "/usr/lib/python3.6/site-packages/pkg_resources/__init__.py", line 2324, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
  File "/usr/local/lib/python3.6/site-packages/qldbshell/__main__.py", line 25, in <module>
    from pyqldb.driver.pooled_qldb_driver import PooledQldbDriver
ModuleNotFoundError: No module named 'pyqldb.driver.pooled_qldb_driver'
[centos@ip-10-204-101-224] ~

$ pip list | grep qldbshell
DEPRECATION: The default format will switch to columns in
the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
qldbshell (1.2.0)

$ sudo pip uninstall qldbshell pyqldb
$ sudo pip install qldbshell
WARNING: Running pip install with root privileges is generally not a good idea. Try `pip3 install --user` instead.
Collecting qldbshell
  Downloading https://artifactory.lifion.oneadp.com/artifactory/api/pypi/pypi/packages/packages/0f/f7/fe984d797e0882c5e141a4888709ae958eb8c48007a23e94000507439f83/qldbshell-1.2.0.tar.gz (68kB)
    100% |████████████████████████████████| 71kB 43.4MB/s
Requirement already satisfied: boto3>=1.9.237 in /usr/local/lib/python3.6/site-packages (from qldbshell)
Requirement already satisfied: amazon.ion=0.5.0 in /usr/local/lib/python3.6/site-packages (from qldbshell)
Requirement already satisfied: prompt_toolkit=3.0.5 in /usr/local/lib/python3.6/site-packages (from qldbshell)
Requirement already satisfied: ionhash~=1.1.0 in /usr/local/lib/python3.6/site-packages (from qldbshell)
Requirement already satisfied: s3transfer=0.3.0 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell)
Requirement already satisfied: botocore=1.20.21 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell)
Requirement already satisfied: jmespath=0.7.1 in /usr/local/lib/python3.6/site-packages (from boto3>=1.9.237->qldbshell)
Requirement already satisfied: six in /usr/local/lib/python3.6/site-packages (from amazon.ion=0.5.0->qldbshell)
Requirement already satisfied: wcwidth in /usr/local/lib/python3.6/site-packages (from prompt_toolkit=3.0.5->qldbshell)
Requirement already satisfied: python-dateutil=2.1 in /usr/local/lib/python3.6/site-packages (from botocore=1.20.21->boto3>=1.9.237->qldbshell)
Requirement already satisfied: urllib3=1.25.4 in /usr/local/lib/python3.6/site-packages (from botocore=1.20.21->boto3>=1.9.237->qldbshell)
Installing collected packages: qldbshell
  Running setup.py install
for qldbshell ... done
Successfully installed qldbshell-1.2.0

Can I see my data now?

$ qldbshell -l demo
Welcome to the Amazon QLDB Shell version 1.2.0
Use 'start' to initiate and interact with a transaction. 'commit' and 'abort' to commit or abort a transaction.
Use 'start; statement 1; statement 2; commit; start; statement 3; commit' to create transactions non-interactively.
Use 'help' for the help section.
All other commands will be interpreted as PartiQL statements until the 'exit' or 'quit' command is issued.
qldbshell > select * from sample;
INFO: { id: 1, name: "Demo Row", location: "USA", domain: null },
{ id: 1, name: "Demo Row", location: "USA", domain: null },
{ id: "1", name: "Demo Row", location: "USA", domain: "" },
{ id: 3, name: "Kiwi", location: "NZ", domain: null },
{ id: 2, name: "Row 2", location: "AUS", domain: "news.com.au" },
{ id: 3, name: "Kiwi", location: "NZ", domain: null },
{ id: 2, name: "Row 2", location: "AUS", domain: "news.com.au" }
INFO: (0.0815s)

And yes, data. I can see it is duplicated, so somewhere in those ten steps I must have run the load twice. This does highlight a known limitation of QLDB: no unique constraints. But wait, that data is not really correct; I don't want null. Going back to the JSON shows the MySQL shell produced exactly that.

$ jq . sample.json
[
  {
    "id": 1,
    "name": "Demo Row",
    "location": "USA",
    "domain": null
  },
...

At some point I also got this load error, but by now I've given up documenting how to do something in order to demonstrate something.

NameError: name 'null' is not defined

One has to wrap the only nullable column with IFNULL(subdomain,'') AS subdomain and redo all those steps again. This is not going to be practical for a wider table, having to wrap every nullable column in IFNULL. However, having exhausted all this time on what was supposed to be a few quiet weekend hours, my post is way too long, and I've learned "Creating examples can be hard".

http://ronaldbradford.com/blog/wdiltw-creating-examples-can-be-hard-2021-03-06/
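Rather than re-exporting everything with IFNULL() wrapped around each nullable column, the nulls could be scrubbed client-side just before the insert. A minimal sketch of that idea; denullify is my own helper, not part of pyqldb, and whether an empty string is the right stand-in for null depends on your schema:

```python
import json

def denullify(rows, default=""):
    # Replace JSON nulls (Python None) with a default value in every row,
    # mirroring what IFNULL(col, '') does on the MySQL side.
    return [
        {key: (default if value is None else value) for key, value in row.items()}
        for row in rows
    ]

rows = json.loads('[{"id": 1, "name": "Demo Row", "location": "USA", "domain": null}]')
print(denullify(rows))
# -> [{'id': 1, 'name': 'Demo Row', 'location': 'USA', 'domain': ''}]
```

In load.py this would be a one-line change, passing denullify(data) instead of data to insert_record, rather than redoing the export for every nullable column.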
Text
Global Atomic Spectroscopy Market Growth, Significant Trends Till 2024
Data Bridge Market Research's new insight: "The Global Atomic Spectroscopy Market is projected to reach USD 8.09 billion by 2024, from USD 4.66 billion in 2016, at a CAGR of 7.2% during the forecast period from 2017 to 2024."
Get free sample @ https://databridgemarketresearch.com/request-a-sample/?dbmr=global-atomic-spectroscopy-market
The market is segmented By Type (Instruments (Atomic Absorption Spectrometer, X-Ray Fluorescence Spectrometer, X-Ray Diffraction Spectrometer, Inductively Coupled Plasma Mass Spectrometer (ICP-MS), Inductively Coupled Plasma (ICP) Spectrometer, Others (LIBS, MIP-OES)), Reagents), By Application (Food and Beverage Testing, Pharmaceutical, Industrial, Environmental Testing, Geological Sciences, Petrochemical, Academics, Others), By End Users (Laboratories, Universities, Manufacturing Facilities, Government Agencies), By Distribution Channel (Direct Tenders, Retail), By Geography (North America, Europe, Asia-Pacific, South America, Middle East and Africa) – Industry Trends and Forecast to 2024
The actual calculation year for the report is 2016, while the historic data is for 2015 and the forecast period runs to 2024.
Atomic Spectroscopy Market: Recent Developments
Atomic spectroscopy studies the interaction of light with gaseous atoms. A spectrometer is a device that converts the sample into gaseous atoms in an atom cell, typically by flame. Atomic spectroscopy is applied to determine elemental composition. Atomic absorption spectrometers are among the most commonly sold and used analytical devices. Drug discovery and development, metabolomics, and diagnostics are some of the applications using atomic spectroscopy.
Inquire before buying @ https://databridgemarketresearch.com/inquire-before-buying/?dbmr=global-atomic-spectroscopy-market
Atomic Spectroscopy Market Categories: Segmentation
The Global Atomic Spectroscopy Market is segmented on the basis of type, application, end-users, distribution channel, and geography.
On the basis of type, the global atomic spectroscopy market is segmented into instruments and reagents. The instruments sub-segment is further segmented into atomic absorption spectrometers, X-ray fluorescence spectrometers, X-ray diffraction spectrometers, inductively coupled plasma mass spectrometers (ICP-MS), inductively coupled plasma (ICP) spectrometers, and others.
Based on distribution channel, the global atomic spectroscopy market is segmented into Direct Tenders and Retail.
Based on application, the global atomic spectroscopy market is segmented into food and beverage testing, pharmaceutical, industrial, environmental testing, geological sciences, petrochemical, academics, and others.
On the basis of end-user, the market is segmented into Laboratories, Universities, Manufacturing Facilities, and Government Agencies.
Atomic Spectroscopy Market: Geographic Segmentation
Based on geography, the market is segmented into six geographical regions:
· North America,
· Latin America,
· Europe,
· Asia-Pacific,
· Japan and
· Middle East and Africa.
The geographical regions are further segmented into major countries such as
· U.S.
· Canada,
· Brazil,
· Argentina,
· Nordics,
· Benelux,
· Mexico,
· Germany,
· France,
· U.K.,
· Belgium,
· Switzerland,
· Turkey,
· Japan,
· China,
· Singapore,
· Australia
· New Zealand,
· India,
· Russia,
· South Africa and many others.
Atomic Spectroscopy Market: Company Share Analysis
There are a large number of players operating in this market, especially in the consumer devices market. Some of the major players in the global atomic spectroscopy market are Thermo Fisher Scientific Inc., Agilent Technologies, PerkinElmer, Inc., Bruker, Rigaku Corporation, Aurora Biomed, Buck Scientific, Shimadzu Corporation, Analytik Jena AG, GBC Scientific Equipment, Labnics Equipment, SAFAS Monaco, Hitachi High-Technologies Corporation, Avantor Performance Materials, and Sigma-Aldrich Co. LLC, among others.
Access Full Report : https://databridgemarketresearch.com/reports/global-atomic-spectroscopy-market-industry-trends-2024/
About Us:
Data Bridge Market Research sets itself forth as an unconventional and neoteric market research and consulting firm with an unparalleled level of resilience and integrated approaches. We are determined to unearth the best market opportunities and foster efficient information for your business to thrive in the market. Data Bridge endeavors to provide appropriate solutions to complex business challenges and initiates an effortless decision-making process.
Contact:
Data Bridge Market Research
Tel: +1-888-387-2818
Email: [email protected]
#Global Atomic Spectroscopy Market report#Global Atomic Spectroscopy Market trend#Global Atomic Spectroscopy Market size#Global Atomic Spectroscopy Market competitors#Global Atomic Spectroscopy Market industry trends#Global Atomic Spectroscopy Market opportunities
Text



@soul100 AFTER SO LONG FINALLY MADE MORE SILLY SILLY INTERACTIONS !!!!!
She's very clumsy, be friends with aurora at your own risk/j
Also this is one of the only times you'll see aurora actually being hurt lol >:3 (and lib is too, it's not even his fault at all lol)
Also it's late so sorry that it's kinda bad- but I'm still kinda happy with it
#now she gotta clean her mess#I love how clumsy she is (and that's something I didn't show yet so that's cool !)#so happy to draw lib again >:3#it's cool that I don't need any refs/pictures to draw him anymore#cuz I drew him alot#I should show some doodles I did while soul where gone#but I keep forgetting#can't to see what soul will think ajdgsjgdjsg#she's probally sleepin#good nighttt#I'm going to sleep too !#GOOD NIGHT ALLLL !! 💫#<3#lib and aurora interacts#woooooo#good night <3
Text

HERE'S THE OTHER DOODLES !
LOOK MY DAUGHTER IS READING A BOOK ABOUT NATURE ON A BEAN BAG !! ...
That's all I had to say-

Here is if ink and error went to the library :3
(btw on the books there's ''do you have adhd'' for ink and ''do you have anger issues'' for error (that's a joke) )

AND LIB HAAA I LOVE HIMM
I think he's really one of my fav sans, I love him SO much sgsgfs
Librarytale and lib sans belong to @soul100
Aurora belongs to me
Ink belongs to comyet,
and error belongs to loverofpiggies (idk if I spelled it right)
#lib sans#library!sans#library sans#lib#lib and aurora interacts#ink sans#ink#utmv ink#error!sans#error#error sans#ink!sans#utmv error#undertale#utmv art#utmv#aurora sans#aurora!sans#aurora
Text

I had to draw them, the design is so cool gsgsfsfs
It's a fusion of Aurora and lib that soul made, and it's so cool so I had to draw it before going to sleep :3
Lib and luminary's design (still not sure if their name is that but ig I'll tag them as that for now and change if it's not) belong to @soul100 ^^
Also the background is random (but I still like it) and I can't draw windows
#lib and aurora interacts#wait I'm having a doubt#should I really tag it as that ?#huhh#idk#undertalleau#ut au#I think soul is sleeping rn but she'll see when walking up ig >:3#utmv#aurora!sans#aurora sans#art#my art#lib sans#library!sans#library sans#lib#fusion#aurora and lib fusion#luminary#idk if I should tag it as that but ->#luminary!aurora#?#huhh I guess ?#luminary!lib#idk if this works or not idk scsgsf nevermind idk haa-#ANYWAYS#huhhmm I'm going to sleep now#good night guys ! <3
Note
DAY huhhh 6 OF SENDING YOU ARTS BEFORE GOING TO SLEEP
Good nighhttt
(there's: ( I have no ideas of what to do anymore btw) (nah I'll find some ideas))



well, if you need ideas, you can always rely on me! I came up with this on the spot. You can think of it as a dreamtale version of those two sillies..? or not 🤷
sorry, they are drawn very poorly, but I think it communicates my idea well enough ^^
on the last one they have lollipops! :D
goodnight!!! sleep well!! 💫
(gosh, I'm such a fan of how you draw expressions, you CAN NOT imagine T.T)
#soul100 art#not reblog#goodnight from aly#soul100 asks#aurora starlight silly#aurora sans#lib sans#lib and aurora interacts#utmv#utdrmv
Text
@soul100 this is what happens if lib tells Aurora that he doesn't get a lot of compliments lol :3
Based on this
I'm so normal abt them that I drew this in the car (I love drawing theemmm)
#I have no idea why I made his head like this#I just trought it'd be funny lol#I love that Aurora is a lil bit taller than lib :D#yayyy#soul100#aurora sans#aurora!sans#lib and aurora interacts#library!sans#lib sans#library sans#lib#:3
Text
HAA GUYS LOOK THIS IS ADORABLE DCSFFS YAYYYY




@aurora-starlight-silly silly silly interaction!!
I'll draw another one soon ;)
the joke is not that funny, but I still like how this little comic turned out ^^
Text
Global Atomic Spectroscopy Market Trends, Outlook 2017-2024
Data Bridge Market Research's new insight: "The Global Atomic Spectroscopy Market is projected to reach USD 8.09 billion by 2024, from USD 4.66 billion in 2016, at a CAGR of 7.2% during the forecast period from 2017 to 2024."
The market is segmented by Type [Instruments {Atomic Absorption, X-Ray Fluorescence, X-Ray Diffraction, Inductively Coupled Plasma Mass Spectrometry (ICP-MS), Inductively Coupled Plasma (ICP), Others (LIBS, MIP-OES)}, Reagents], Application (Food and Beverage Testing, Pharmaceutical, Industrial, Environmental Testing, Geological Sciences, Petrochemical, Academics, Others), End Users (Laboratories, Universities, Manufacturing Facilities, Government Agencies), Distribution Channel (Direct Tenders, Retail), and Geography (North America, Europe, APAC, South America, MEA) – Industry Trends and Forecast to 2024.
Request for free sample @ https://databridgemarketresearch.com/request-a-sample/?dbmr=global-atomic-spectroscopy-market-2
The actual calculation year for the report is 2016, while the historic data is for 2015 and the forecast period runs to 2024.
Segmentation: Global Atomic Spectroscopy Market Categories
· The Global Atomic Spectroscopy Market is segmented based on product type, application, end-user, distribution channel, and geography.
· On the basis of type, the global atomic spectroscopy market is segmented into instruments and reagents.
· Based on the application, the market is segmented into food and beverage testing, pharmaceutical, industrial, environmental testing, geological sciences, petrochemical, academics, and others.
· On the basis of end-users, the market is segmented into laboratories, universities, manufacturing facilities, and government agencies.
· Based on distribution channel, the market is segmented into direct tenders and retail.
Inquire before buying @ https://databridgemarketresearch.com/inquire-before-buying/?dbmr=global-atomic-spectroscopy-market-2
Geographic Segmentation: Global Atomic Spectroscopy Market
Based on geography, the market is segmented into five geographical regions:
· North America,
· Latin America,
· Europe,
· Asia-Pacific, and
· Middle East and Africa.
The geographical regions are further segmented into major countries such as
· U.S.
· Canada,
· Brazil,
· Argentina,
· Nordics,
· Benelux,
· Mexico,
· Germany,
· France,
· U.K.,
· Belgium,
· Switzerland,
· Turkey,
· Japan,
· China,
· Singapore,
· Australia
· New Zealand,
· India,
· Russia,
· South Africa and many others.
North America is expected to dominate the market for atomic spectroscopy owing to technological developments and high expenditure in research and development.
Recent Developments: Global Atomic Spectroscopy Market
· Increasing penetration of technology in the R&D field, growing demand from global research organizations for highly sophisticated analytical instruments, increasing government support, rising demand for the discovery of newer molecules from pharmaceutical and chemical organizations, and increasingly strict international standards for food and drug safety are some of the factors driving the market for atomic spectroscopy. The most important driver for the atomic spectroscopy market is the rapid growth in R&D. On the other hand, high expenditure for initial set-up and instrumentation, high maintenance costs, the requirement for technically well-skilled personnel, and lack of awareness in many countries are some of the major restraints on atomic spectroscopy.
· Atomic spectroscopy studies the interaction of light with gaseous atoms. A spectrometer is a device that converts the sample into gaseous atoms in an atom cell, typically by flame. Atomic spectroscopy is applied to determine elemental composition. Atomic absorption spectrometers are among the most commonly sold and used analytical devices. Drug discovery and development, metabolomics, and diagnostics are some of the applications that use atomic spectroscopy.
Company Share Analysis: Global Atomic Spectroscopy Market
The market report contains an in-depth profiling of the key market players, along with the recent developments (new product launches, partnerships, agreements, collaborations, and joint ventures) and strategies adopted by them to sustain and strengthen their position in the market. For example, on May 3, 2016, Agilent Technologies Inc. introduced the Agilent 5110 ICP-OES, which would enable scientists to perform faster, more precise ICP-OES analysis in food, environmental and pharmaceutical testing, as well as for mining and industrial applications. The market scope also covers company shares for the atomic spectroscopy market for North America, Europe, Asia-Pacific, South America, and the Middle East and Africa.
Some of the major players operating in the global atomic spectroscopy market are Thermo Fisher Scientific Inc., Agilent Technologies Inc., PerkinElmer, Inc., Bruker, Rigaku Corporation, Aurora Biomed Inc., Buck Scientific Instrument Manufacturing Company, Shimadzu Corporation, Analytik Jena AG, GBC Scientific Equipment, Labnics Equipment, SAFAS Monaco, Hitachi High-Technologies Corporation, Avantor Performance Materials, LLC, and Sigma-Aldrich Co. LLC, among others.
Access Full Report: https://databridgemarketresearch.com/reports/global-atomic-spectroscopy-market/
#Global Atomic Spectroscopy Market report#Global Atomic Spectroscopy Market trend#Global Atomic Spectroscopy Market Segments#Global Atomic Spectroscopy Market size#Global Atomic Spectroscopy Market industry
Text
@aurora-starlight-silly !!!!!!
(not canon because Lib can't travel between AUs, but still XD)
OG under the cut

Text
DAMN I FORGOT TO REBLOG



how that happened is up to your interpretation! :D
can you imagine that I'm so desperate for interacting with @aurora-starlight-silly that I make myself draw things related to Aly... (not that I don't enjoy it ofc, I love drawing (especially those two!!) and I love how my art skills have increased since I started this blog! so, don't worry about that)
Text

Lib! don't be rude to your friend!
aurora by aurora ( @aurora-starlight-silly )