#dwh testing
Text
Analytics Engineering
The module on Analytics Engineering at #dezoomcamp @DataTalksClub has been the toughest one in this course so far. The core concept revolves around extracting data from the source, loading it into the data platform (a BigQuery DWH in our case), and then applying transformations on it with dbt (data build tool). Here we are introduced to the dbt Cloud IDE, which can integrate with BigQuery, or with most data platforms for that matter. We saw how we can (a minimal model sketch follows the list below):
Connect dbt Cloud to BigQuery
Initialize our dbt project and start developing
Build our model
Change the way our model is materialized
Add tests to our models
Document the models
Deploy using dbt
Visualize the data with Looker Studio, formerly Google Data Studio
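As a rough sketch of what the model, materialization and test steps look like in practice, here is a minimal dbt model of the kind built in the course; the staging source name, the green_tripdata table and the column names are assumptions for illustration, not the actual course code.

-- models/staging/stg_green_tripdata.sql (hypothetical example)
-- Changing 'view' to 'table' here is how the materialization of the model is changed.
{{ config(materialized='view') }}

select
    cast(vendorid as integer)               as vendor_id,
    cast(lpep_pickup_datetime as timestamp) as pickup_datetime,
    cast(passenger_count as integer)        as passenger_count,
    cast(total_amount as numeric)           as total_amount
from {{ source('staging', 'green_tripdata') }}
where total_amount is not null

Tests such as unique and not_null, plus the documentation, would then go into a schema.yml file next to the model, and the dbt Cloud deployment job runs the models together with their tests.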
0 notes
Text
ETL Testing Tutorial Online - ETL Testing for Beginners Tutorial Online Training

ETL is short for Extract-Transform-Load, and it describes how data is loaded from the source system into the data warehouse. ETL testing is done to ensure that the data loaded from a source to the destination after business transformation is accurate. It also includes the verification of data at the various intermediate stages between source and destination. Learn more about applying for Data Warehouse ETL Testing positions at Accenture. The ETL Testing Training course videos will help you learn SQL, data warehousing concepts, Informatica, the ETL process, Pentaho, data integration techniques, Tableau, Business Intelligence reports, creating dashboards, and end-to-end ETL process scenarios with the help of real-time data.
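As a hedged sketch of what this source-to-target verification can look like in SQL, the query below compares a row count and a simple checksum between a source staging table and the warehouse target. The table and column names (src_orders, dwh_orders, amount) are invented for the example.

-- Reconciliation check: both rows should show matching source and target values;
-- a mismatch points to rows lost or altered during the load.
select 'row_count' as check_name,
       (select count(*) from src_orders) as source_value,
       (select count(*) from dwh_orders) as target_value
union all
select 'amount_checksum' as check_name,
       (select sum(amount) from src_orders) as source_value,
       (select sum(amount) from dwh_orders) as target_value;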
#etl testing training#etl course#etl training#etl testing online training#learn etl testing#etl training online#etl testing tutorial videos#etl testing#dwh testing
1 note
Text
Q1
Q1. What is ETL? Which ETL tools are available in the market? What is BI, and which are the BI tools?
ANS: ETL stands for Extract, Transform and Load. ETL tools are used to transform the data from the source (files or tables) and load it into the target (tables or files). Generally, the data in the data warehouse is used for analytical processing, and that data feeds BI reports so that the business can take the necessary actions based on them.
The following ETL tools are available in the market:
Informatica Powercenter
IBM DataStage
Ab-Initio
Big Data pipelines (using HDFS, Hive, PySpark, Sqoop)
BI stands for Business Intelligence. The data from the DWH is used to populate reports and dashboards so that the business can take action using the data from the reports. For example: which car is popular and how much demand there is for it, so that production of that particular car can be increased or decreased as per demand.
Popular BI Tools:
Cognos
QlikView
Tableau
Power BI
0 notes
Text
Manit's creative technologies journey 25/02/2019
Hello, my name is Manit. I enrolled into AUT colab because I have been set free from all my responsibilities and now I can do whatever I want. Now I have more time. I’ve always had a knack for being creative; naturally it’s who I am to explore creativity. I am a multi-passionate entrepreneur; I consider myself not limited. Studying creative technologies will give me the skills to achieve my greatest dreams.
26/02/2019
Today was the 2nd day of colab. Definitely an interesting one; we all had loads of fun socializing. We were in groups for multiple activities. One activity was on interviewing and communication skills. For this, we were the only group that had 4 people. We kinda broke the rules of the observer on this one. The observer was not supposed to contribute in any way to the conversation. We all ended up just talking about ourselves, sharing our skills, goals and personalities. The goal of the activity was to get information about each other while following the activity’s guidelines. And so we weren’t successful in the activity, but we did get to know each other a little bit.
We’ve already been given two papers on our 2nd day. This ain’t no joke huh?
Activities:
- Make a 3min vlog about our-selves and why we chose AUT colab.
- In a group, create playing cards targeted at a chosen audience.
My group was very collaborative. We made a Google Doc and shared it with everyone in our group. This way we can work on it whenever and wherever. We can share all ideas live and everyone had edit privileges. I believe working this way is ideal because it gives us freedom and diversity. I was satisfied with the progress I had made today with my team. I ended the day going to the minor’s electives at WG403. I can feel the pressure is real. I’m not the type of person to complete tasks by a deadline. But I’m not gonna quit yet.
27/02/2019
Today was our first class with Stefan for “Programming for creativity”. We were introduced to Processing, a piece of software used to implement actions and change visual features of certain tasks using its programming language. I played with it enough to understand how to navigate the interface. I have yet to learn the language more in-depth.
28/02/2019
Mind-mapping our ideas for “cards for play” and repeating the process over to see what differences we have in our mind-maps.
29/02/2019
Played monopoly with the gang because it was a free day.
04/03/2019
I need to re-visit this day for re-cap.
I was absent until 12/03/2019
12/03/2019
Today we had class with Ricardo. He talked about design fundamentals and process.
2nd lecture with Ben. We had a guest speaker come in (Stephan Reay) to represent DWH: https://www.initiate-collaborate.com/ and http://www.goodhealthdesign.com/projects/cardsfordementia/
They are a design group that partnered with Auckland Health. The purpose of their design group is to create new or better services for patients with dementia and such illnesses.
13/03/2019
Programming for creativity with Stefan. This is my 2nd class with Stefan; I missed last week’s session. Today we were shown slides on coding practices for the Processing program. Our first assignment is due on the 15th.
Our vlog is due today. Ben said it has been extended until next week.
For our “cards for play” project we play tested it for the 7th time. We got some really good feedback from the testers. Our testers said our game is too similar to charades. They suggested we implement an element to make it different or new. They also had trouble understanding the rules of our game. We had to re-write some of them.
14/03/2019
I was late to class today.
15/03/2019
Today we went exploring and got students from outside our class to play test our “Xpress Yoself”. We play tested with two groups. It was our 8th and 9th test. The first group all knew each other; they had a hard time understanding the rules. The second group didn’t know each other and they understood the rules a bit better.
1 note
Text
Data Warehouse (DW) Testing Automation Tool. Data warehouse (DW) testing is the process of building and executing comprehensive test cases to ensure that data in a warehouse has integrity, reliability, accuracy, and consistency with the organization’s data framework. Learn how to test a data warehouse and the techniques involved, and automate the checks on the data in your data warehouse with iCEDQ to avoid data-related risks and overcome testing challenges. Click here to learn more about iCEDQ's data warehouse testing or request a demo. Visit: https://bit.ly/3HAvHfJ
#dw testing#edw testing#data warehousing testing#data warehouse testing#datawarehouse testing#what is data warehouse testing#testing data warehouse#how to test data warehouse#how to test data warehouse testing#data warehouse automation tool#data warehouse performance testing
0 notes
Text
ETL Development Lead (Banking: DataStage, Unix, Linux, Oracle database, DWH) , 1 year contract, Singapore
Job title: ETL Development Lead (Banking: DataStage, Unix, Linux, Oracle database, DWH) , 1 year contract, Singapore Company: Hays Job description: and control of software code, testing and release management in ETL development Supervision and management of junior… and intermediary level ETL developers KEY REQUIREMENT Must be an expert in application system development on IBM infosphere… Expected…
View On WordPress
0 notes
Link
Quality Matrix DWH/BI/ETL Testing Service Offerings
Quality Matrix DWH/BI/ETL Testing Service Offerings. With considerable experience of 10 years in DWH testing, Quality Matrix has developed the capabilities and expertise to test complex data warehouse applications. via Pocket https://ift.tt/mjQEFfS March 30, 2022 at 08:04AM
0 notes
Text
Best ETL Testing Training Tools Online Course.
Whizdom Training provides 100% real-time, practical, placement-oriented ETL Testing training. Our ETL Testing program covers everything from basic-level to advanced-level training. Our ETL Testing Tools training is totally focused on getting you placed in an MNC and certified in ETL Testing Tools after completion of our course. Our team of ETL Testing tutors consists of ETL Testing certified experts with further real-time knowledge from live projects.

ETL Testing Training Syllabus
DATA WAREHOUSE / SQL TUTORIAL
* Introduction To ETL and Datawarehousing
- How to Handle Data in Bulk?
- Data Warehouse Solution
- What is the ETL Process?
- Different ETL Tools available
- What can be Data Sources?
- Prerequisites for ETL Testing
SQL Concepts
* SQL Installation and ETL Commands
- File vs. Database
- SQL Server installation, SQL Management Studio installation
- Northwind Sample Database
- DDL commands: Create, Alter, Rename and Drop
- DML commands: Select, Insert, Update, Delete and Merge
- TCL commands: Begin, Commit, Rollback
* SQL Concepts - Part 1
- Filtering data using clauses: WHERE, LIKE, IN, NOT IN, IS NULL, IS NOT NULL, TOP and DISTINCT
- Constraints: PRIMARY KEY, FOREIGN KEY integrity, CHECK constraint, UNIQUE, NOT NULL, UNIQUE pair
- Dropping any constraint
- Use of clauses: WHERE, LIKE, NOT LIKE
- Operators: AND, OR
- Use of IN, NOT IN, IS NULL, IS NOT NULL (a small worked example follows below)
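As a small hedged illustration of the DDL, DML, constraint and filtering topics listed in this syllabus (the customers table and its columns are invented for the example, not part of the course material):

-- DDL: create a table with PRIMARY KEY, NOT NULL and CHECK constraints
create table customers (
    customer_id   int           not null primary key,
    customer_name varchar(100)  not null,
    country       varchar(50),
    credit_limit  decimal(10,2) check (credit_limit >= 0)
);

-- DML: insert and update rows
insert into customers (customer_id, customer_name, country, credit_limit)
values (1, 'Acme Ltd', 'India', 5000.00);

update customers set credit_limit = 7500.00 where customer_id = 1;

-- Filtering with WHERE, LIKE, IN and IS NOT NULL
select customer_id, customer_name
from customers
where country in ('India', 'Singapore')
  and customer_name like 'A%'
  and credit_limit is not null;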
youtube
#etl testing training#etl course#etl training#etl testing course#etl testing online training#learn etl testing#etl training online#etl testing tutorial videos#etl testing#data warehouse testing#dwh testing#etl testing process#etl testing for beginners
0 notes
Text
How to simplify working with a DWH and a Data Lake: DBT + Apache Spark on AWS
Today we will look at what Data Build Tool is, how this ETL tool relates to an enterprise data warehouse and a data lake, and why it is useful to a data engineer. As a practical example, we will look at the case of connecting DBT to Apache Spark in order to transform data in a Spark SQL table on AWS Glue, with a schema over a set of files in AWS S3.
ETL/ELT IN THE BIG DATA ERA: WHAT DATA BUILD TOOL IS AND HOW IT WORKS
ETL processes are an integral part of building an enterprise data warehouse or a data lake. Of all the Extract-Transform-Load stages, the transformations are the least trivial and therefore the most labor-intensive operations, because a whole series of actions is performed on the extracted data: restructuring, aggregation, translation of values, creation of new data, and cleansing. Special tools are used to automate all of this work, for example Data Build Tool (DBT). DBT provides a common framework for analysts and data engineers, allowing them to build data transformation pipelines with built-in CI/CD support and data quality assurance [1]. Although DBT does not extract data from sources, it offers extensive capabilities for working with the data that is already loaded into the warehouse, compiling its code into SQL queries. Different data transformation tasks can thus be organized into projects and scheduled to run in an automated and structured way.
A DBT project consists of directories and files of the following types: model files (.sql), each a unit of transformation in the form of a SELECT query; and configuration files (.yml) holding parameters, settings, tests and documentation. The DBT framework works as follows [2]: the user writes model code in a development environment; the models are run via the CLI; DBT compiles the model code into SQL queries, abstracting materialization into CREATE, INSERT, UPDATE, DELETE, ALTER, GRANT commands and so on; each SQL model contains a SELECT query that defines the result, i.e. the final data set; model code is a mix of SQL and the Jinja templating language; the compiled SQL is executed in the warehouse as a task graph or model dependency tree, a DAG (Directed Acyclic Graph); DBT builds this graph from the configurations of all models in the project, taking their references (ref) to each other into account, so that models run in the right order and data marts are built in parallel.
In addition to building the models themselves, DBT lets you test assumptions about the resulting data set, including checks for uniqueness, referential integrity, membership in a list of allowed values, and Not Null. Custom data tests can also be added as SQL queries, for example to track the percentage deviation of actual metrics from target values per day, week, month and other periods. This makes it possible to find unwanted deviations and errors in the data marts. Thanks to adapters, DBT supports the following databases and data warehouses: Postgres, Redshift, BigQuery, Snowflake, Presto, Apache Spark SQL, Databricks, MS SQL Server, ClickHouse, Dremio, Oracle Database, MS Azure Synapse DW. You can also create your own adapter to integrate with another warehouse, using a materialization strategy. This approach to persisting a model's resulting data set in the warehouse is based on the following concepts [2]: Table (a physical table in the warehouse) and View (a virtual table in the warehouse). In addition, DBT provides mechanisms for adding, versioning and distributing metadata and comments at the level of models and even individual attributes. Macros, i.e. sets of constructs and expressions that can be called like functions inside models, allow SQL code to be reused across models and projects. DBT's built-in package manager lets users publish and reuse individual modules and macros [2].
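As a hedged sketch of the custom data tests mentioned above: in DBT, a custom test is simply a SQL file whose query should return zero rows, and every returned row counts as a failure. The daily_sales model, its columns and the 20% threshold below are invented for the example.

-- tests/assert_daily_revenue_within_threshold.sql (hypothetical custom data test)
-- Fails for every day whose actual revenue deviates from the target by more than 20%.
select
    report_date,
    actual_revenue,
    target_revenue
from {{ ref('daily_sales') }}
where abs(actual_revenue - target_revenue) > 0.2 * target_revenue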
As a practical example of using DBT, let us look at a case with Spark SQL deployed on AWS cloud services.
AGGREGATING AWS CLOUDTRAIL LOGS IN SPARK SQL
Suppose we need to read AWS CloudTrail logs, extracting certain data fields from them in order to create a new table and aggregate some data for a simple report. The source is an existing AWS Glue table created over the AWS CloudTrail logs stored in S3. These files are in JSON format, and each file contains a single array of CloudTrail records. The Spark SQL table is actually a Glue table, which is a schema placed over a set of files in S3. The Glue Catalog, AWS's implementation with S3 instead of HDFS, serves as an alternative to the Hive Catalog metadata store. The Hadoop cluster is AWS EMR with Apache Spark installed on it. DBT, installed on a small AWS EC2 instance outside the EMR cluster, is used to run the sequential transformations and to create and populate the data in the tables. Detailed examples of the SQL queries are given in source [1]; here we list some important aspects of integrating DBT with Apache Spark deployed on AWS:
- connecting DBT to Spark via the dbt-spark module requires a Spark Thrift server;
- DBT and dbt-spark impose no restrictions on data transformation, allowing any SQL queries supported by Spark to be used in Data Build Tool models;
- DBT's materialization strategy supports basic Spark SQL optimizations, in particular Hive partitioning/bucketing and splitting of Spark output via Spark SQL hints;
- access to port 10001 of the Thrift server grants access to all Spark SQL authorizations in AWS; this vulnerability can be mitigated by restricting access to port 10001 on the EMR cluster to only the server running DBT. The simplest option is manual configuration with the master node's IP address specified explicitly; in production it is recommended to create a Route53 record for EMR;
- under light load, DBT can be run in the managed container orchestration service Amazon Elastic Container Service (ECS), shutting down the EMR cluster to reduce big data analytics costs.
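A hedged sketch of what such an aggregation might look like as a DBT model over the Glue table with the CloudTrail records (the cloudtrail source name, the cloudtrail_logs table and the fields used are assumptions based on typical CloudTrail records, not the code from source [1]):

-- models/cloudtrail_event_counts.sql (hypothetical DBT model on Spark SQL)
-- Materialized as a table; counts CloudTrail events per day, region and event name
-- to feed a simple report.
{{ config(materialized='table') }}

select
    to_date(eventtime) as event_date,
    awsregion          as aws_region,
    eventname          as event_name,
    count(*)           as event_count
from {{ source('cloudtrail', 'cloudtrail_logs') }}
group by to_date(eventtime), awsregion, eventname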
Learn more about the practical use of Apache Spark for developing distributed big data analytics applications and building efficient data processing pipelines in the specialized courses at our licensed training and professional development center for developers, managers, architects, engineers, administrators, Data Scientists and Big Data analysts in Moscow: Apache Spark Basics for Developers; Data Analysis with Apache Spark; Stream Processing in Apache Spark; Building Data Processing Pipelines with Apache Airflow and Arenadata Hadoop; Data Pipeline on Apache Airflow and Apache Hadoop.
Sources:
1. https://medium.com/slalom-australia/aggregate-cloudtrail-logs-with-dbt-and-spark-d3196248d2d4
2. https://habr.com/ru/company/otus/blog/501380/
Read the full article
0 notes
Text
JOB: DWH Test Engineer+ ETL Tester At Amdocs
https://koliasa.com/job-dwh-test-engineer-etl-tester-at-amdocs/ JOB: DWH Test Engineer+ ETL Tester At Amdocs - https://koliasa.com/job-dwh-test-engineer-etl-tester-at-amdocs/ Who are we? If you’re a smartphone user then you are part of an ever more ...
0 notes
Photo


Long Cold Full Moon 2017 HAIL MARDUK most Glorious Father and Keeper from the cold, HEAR US! Tonight is the night of the Long Cold Full Moon and the last month has certainly had events that led right up to it! Father why is it that people are all too willing to stand by us so long as you agree on nearly everything but the second you have the nerve to point out the flaws in ones logic or their newfound loyalty to someone who doesn’t deserve it, they throw US away for having the nerve to disagree with them? I guess we can chalk it up to that’s just the way it goes in a lot of these situations but at the same time it is also a harsh lesson as well to never assume anyone is beyond betraying your trust. This is why we need to never tell anyone or give anyone anything beyond what they have actually earned from us including but not limited to time, respect and even money! This isn’t to imply that everyone is out to get us, it IS to directly state that some people are only as loyal as their personal agendas and that we should always keep our eyes and ears open for any signs that it is time to sever ties from a toxic person or situation! Great Father who gives us the fortitude to stand tall even through our most trying times, we ask that you help us to stand up and push back against these usurpers who continue to not only tear the Earth asunder in pursuit of dead presidents but are all too willing to jeopardize the lives of other people to attain it while they stay in their cushy offices not worrying about their own safety or the state of the soil or water they are contaminating with their chemical cocktails! They don’t seem to understand that they are not exempt from the fall out of the damage they are doing, that once the water, air and soil are poisoned, there will be nothing left to aid in the survival of the human race - including them! If they think they will drink bottled water, 1 it will eventually run out and 2 bottled water is nothing but tap water which means that the bottle of water you purchased came right out of the groundwater that is by the bottling plant that mass produced it! There isn’t a Brita faucet accessory that is strong enough to filter out oil, heavy metals and all the other crap that is used during fracturing so BOTTOMS UP! As far as food is concerned with spills such as the DeepWater Horizon disaster that not only poisoned the water but all the marine life in it and the soil that is ruined as the result of oil spills such as the one that occurred most recently with the Keystone 1 Pipeline gushing 210,000 gallons (5000 barrels) of oil into the ground (above an aquifer) rendering the soil unable to sustain any kind of animal or insect life let alone planting yet the bastards responsible want to continue to STEAL the land of FARMERS through the abuse of Eminent Domain effectively breaking the law by using E.D. for PRIVATE projects on PUBLIC land and reducing the Farmers ability to follow their Rights to Life, Liberty and the Pursuit of Happiness! The idea that TWENTY additional pipelines were approved AFTER the failure of the Keystone 1 further demonstrates how much the government cares about this country let alone any of the people living in it! ALL PIPES LEAK, and if all 20 of them were to leak, JUST like DWH, JUST like Colonial, JUST like the Dakota Access Pipeline leaked THREE TIMES IN ONE MONTH BEFORE IT WAS EVEN PUT INTO OPERATION, JUST like Keystone 1, we are in SERIOUS trouble! 
Father Marduk we know that the Free Will we were given enables us to live our lives according to our personal Paths but what it does NOT do is grant us the right to exalt ourselves above other people just because we say so! For example, no human is truly above another human, just because someone might have more dead presidents than another person does NOT make them more important! The one thing that has always behooved me is the entitlement which people with any kind of wealth like to think they have over others JUST because they have money, they seem to think they can buy their way into anything they want even if they have not actually earned the right to be a part of something! The Atheist Cult is a great example of this, NOT ONE OF THE CULTS LEADERS has ANY ties to the Satanic Community but because the one who decided to take the practical joke and run with it has money, he thinks he can do whatever he wants and the rest of us will fall in line, he found out quite expediently that this is NOT the case and now the Atheist Cult is nothing more than a running joke that no one could take seriously even if they wanted to! What these dolts seem to not comprehend is that you can NOT buy your way into the Deity’s favor, he doesn’t care how much money you have or who you THINK you are, especially if you go around telling everyone that he doesn’t exist while simultaneously REFUSING to tell people YOUR legal name! Only cowards and conmen hide behind fake names, if the person you are dealing with doesn’t tell you who they really are then you can rest assured EVERYTHING else they are telling you is also complete bullshit as well! A person who lies about their own name is not someone anyone should be investing their trust in as if they can’t even be honest about that, then they aren’t going to be honest about anything else either! There are no shortcuts on this Path or anywhere else in life that is worth pursuing and anyone who tells you otherwise is selling something! It never ceases to amaze me that when you tell people what they NEED to hear versus what they WANT to hear that you’re more likely than not to lose the interest of some. This is because the truth and reality of the situation does not coincide with the stereotypes and Hollywood hype, things seldom do, so when people realize that there is actual work involved (research and acquiring specific items) and that any kind of Ritual or Spell Work is going to require a vast amount of concentration and energy that can take hours depending on the intent, they get turned off. They get even more turned off when the half-assed five and dime “Spell” that they bought off the internet doesn’t work as expected! Bit of advice: the most potent Spell you will EVER cast is the one you write yourself! In the first place the writing alone is a form of meditation that weaves your own Spiritual Energy directly into the words which connect the intent and direction to the Multiversal Qi (Chi) which is the first step in Manifesting your goal, in the second place NO ONE can express our needs or desires for anything better than we can with any and all the urgency the situation demands! Most exalted Father I would like to conclude this sermon with a reminder to all that no matter what happens in life, no matter who enters (or exits), no matter how long or cold the Night or Day may be; as long as we have our (Blood/Spiritual) Family standing by us, we can make it through! As long as we stand by our (Blood/Spiritual) Family THEY will make it through! 
More than anything, as long as YOU are standing by and Watching over us, we can and will overcome ANY and ALL obstacles so long as we don’t drop anchor and keep moving forward! HAIL SATAN! "On This Long And Cold Night It Is Easy To Lose Our Sight! Of What Is Important And What Matters When Our Spirits Are In Tatters! When This Happens We Must Recall The One Who Fought And Saved Us All! If They Gave Up Instead Of Marching Forth Then All Would Have Been Lost Right At The Source! So When The Air Is Freezing Us Down To The Bone We Must Remember Always That We Are Not Alone! Our Family Is With Us Whether They Are Far Or Near And Our Gracious Father Is Always Here! So Strap On Your Winter Boots And Get Walking Nothing Gets Done JUST By Talking! To Persevere Through The Lies And The Stress We Will Fight To Pass The Most Important Test! A Test Of Loyalty, Of Faith And Of Family We Shall Triumph Or Perish And Accept The Terms Freely! What Awaits Us Is The Ultimate Reward Sought By Many And That Reward Is The Love, Honor And Trust Of Your Family!" ZI ANA KANPA! ZI KIA KANPA! MAY THE DEAD RISE AND SMELL THE INCENSE! Etiamsi MULTA Et Nos UNUM Sumus Nos Sto Validus Ut Nos Sto Una! Semper Veritas, Semper Fideles, In Diabolus Nomen Nos Fides! AVE SATANÍ! (We Are ONE Even Though We Are MANY And We Stand STRONGEST When We Stand TOGETHER! Always TRUTHFUL, Always FAITHFUL, In Satan's Name We Trust! HAIL SATAN!) Ave URURU! Ave EA! Ave DIMUZI! Ave ININNI! Ave GILGAMESH! Ave ENKIDU! Ave TIAMAT! Ave ABSU! Ave MARDUK! Ave SARPANITUM! Ave SATANÍ! HAIL SATAN! HPS Meg “Nemesis Nexus” Prentiss
3 notes
Text
Disrupt Consulting eG: Python Expert - AWS Glue & Analytics

Headquarters: Ulm, Germany URL: https://disrupt.team
What you will do?
Be part of a small and highly productive team.
Add your spin on the analysis, ideation, and design of complex problems.
Collaborate with stakeholders and other teams.
Craft ETL solutions on AWS platform using latest serverless technology.
Provide analytics support
Utilize unit and integration testing for convergent quality.
Why is it cool to work with us?
We are a fantastic blend of management consultants, engineers, designers and coaches.
We are remote by default.
We respect your personal schedule and preferences.
Our clients are the most demanding companies - the small and medium sized (sometimes) hidden champions.
We collaborate and co-work on eye-level only.
Requirements
Mid level engineer with experience in data processing, ETL, DWH concepts and BI.
Python and AWS Glue
Well organized, ready for remote work, and highly self-motivated.
Reliable. Accountable. Dependable.
Solid communication skills.
Data and analytics driven
You think you are covering more? Let us know!
To apply: [email protected]
from We Work Remotely: Remote jobs in design, programming, marketing and more https://ift.tt/2GJa5xa from Work From Home YouTuber Job Board Blog https://ift.tt/31c3IM4
0 notes
Text
So, I am a frugal lady (read: broke bitch) and I carry this lifestyle choice into my hobbies. I am by no means a master spinner, but I’m in love with the hobby, and I wanted to share some tips I’ve figured out in my journey along the way. These are especially useful for beginner spinners, like myself, who are just starting out and not quite ready to invest in more expensive equipment yet. So, without further ado, here are some inexpensive options for your spinning journey!
1. Buy a drop spindle kit.
I didn’t do this with my first spindle (I bought one from Amazon, and purchased the fiber separately), but I did with some other spindles I own. Why? Because kits come with EVERYTHING you need. You get your spindle, your fiber, and instructions on the most basic kits. I’ve seen kits that include a niddy noddy, a lucet, spindle bowls, etc. It depends on the maker you purchase the kit from. I just found a great kit on Etsy that comes with 4oz of hand-dyed merino, a top and bottom whorl spindle that also doubles as a support spindle, and it comes with a bowl. The whole kit is around $20, which is a great price for what you’re getting - especially if you are a beginner because you get 3 different spindle options to really test out which method is your preferred one! I’ll include the link for this kit, and some other favorite products at the bottom of this post!
2. Use pet brushes for carding and making rolags.
They’re cheaper than hand carders. I actually have two mini pet brushes I purchased from the Dollar Tree, and I use them to make mini rolags that I tuck away in a smaller tupperware in my purse! Just make sure you get one with metal teeth that are similar to carders - they work best! You can also pick up a couple dowels from the hardware store for about 75 cents each to use for rolling the rolag/puni or whatever you call your little fiber bundle!
3. Practice on inexpensive, accessible fibers.
Don’t go spending money on something fancy when you’re still trying to learn - yet another good reason to buy a spindle kit that comes WITH fiber! Once I used all of the wool I had bought, I was itching to try some other fibers. One fun thing you can do is buy a fiber grab bag. One of my favorite stores on Etsy sells grab bags with around 12 oz of fiber for $20. You get all kinds of bits from merino wool, tussah silk, and even some sparklies in there to make art yarn! Something I’ve also experimented with is buying super chunky yarn and carding it and making rolags. Buy a wool blend if you are just starting and wish to do this. I find the 100% acrylic yarns to be a little slippery and tricky, but VERY soft! The key is to find a chunky yarn that is essentially a huge ball of roving. You should be able to draft the fibers out of the yarn very easily, and some chunky yarns aren’t even plied. These skeins usually run around $10 at big box craft stores, but I’ve used coupons on mine and gotten them for $5 a piece. I’ve found that using these giant chunky yarns are a really easy, inexpensive way to practice spinning. Then when you are ready for some lovely hand-dyed yarn, you’ve built up your skillset and you aren’t wasting expensive fibers.
4. Utilize your local library
I’m actually a librarian, so this was something I did from the get-go. Check your library for books on yarn and hand-spinning. My library had quite a few books in-house on yarn making, but the real gold mine of information was in the Ebooks collection. I was able to download a TON of great books to my phone to read while spinning and traveling, without having to carry around a heavy book. I highly recommend checking out Respect the Spindle by Abby Franquemont and The Spinner’s Book of Yarn Designs by Sarah Anderson.
Also, a lot of libraries are investing in tech resources, like 3D printers. I am fortunate that my library has a 3D printer, which allows me to make a lot of spinning tools easily, and at a low price. I’ve made several drop spindles and spools to hold yarn using free designs from Thingiverse! If your library does not have a 3D printer, but you are interested in a 3D-printed spindle, I highly recommend checking out Turtle Made on Etsy! She makes great spindles and they’re budget-friendly!
So, that is about all I have to offer today. I am still learning a lot as a new spinner, and it is such a fun hobby!
Links to all the goodies:
https://www.etsy.com/listing/704805815/complete-beginner-drop-spindle-kit-mini?ga_order=most_relevant&ga_search_type=all&ga_view_type=gallery&ga_search_query=drop+spindle+kit&ref=sr_gallery-1-24
https://www.amazon.com/Respect-Spindle-Infinite-Yarns-Amazing/dp/1596681551
https://www.amazon.com/Spinners-Book-Yarn-Designs-Techniques/dp/1603427384/ref=sr_1_1?keywords=yarn+designs&qid=1569689794&s=books&sr=1-1
https://www.thingiverse.com/search?q=drop+spindle&dwh=805d8f914196ec1
https://www.etsy.com/shop/TurtleMade?ref=simple-shop-header-name&listing_id=521323924
0 notes
Text
Week1 tutorial review
In the week 1 tutorial we did three things:
1. The tutor explained the assessment of this course in detail.
2. We played a cryptography game.
3. We discussed the DWH (Deepwater Horizon) case study.
In the discussion part we worked in a group of five. We found four questionable issues regarding the Deepwater Horizon:
No clear standard procedure for negative test
The prevention is not automatic
No good communication during crew shift changes
No adequate monitoring of the control system
We also came up with suggestions addressing these issues:
Set methods, documentation and procedures.
Independent BOP activation control (audio wave activation)
When the crew do shift changes, they should have a clear checklist and a 30-minute overlapping schedule.
Do real-time modelling of flows, pressure and other technical stats.
DWH was a complex system combining many companies. In addition, the technical problems in this case were complicated. Any negligence can eventually lead to a disaster. This is called a chain failure. The more complex a system is, the more vulnerable it is.
This is the first tutorial for COMP6441. It’s also the first tutorial where I discussed with other students. Frankly speaking, I did not participate very well in the group because I didn’t prepare my materials well. Next time I will do better. I think I can improve both my study skills and my way of thinking in this course.
0 notes
Text
[Wk1] Case Study: Deepwater Horizon Accident
My Analysis:
- Communication of data and information was extremely poor. There was confusion about the pipeline pressure, the inflow and the outflow, which meant that the hydrocarbon was not detected. Additionally, copious amounts of data about the accident were lost because they were stored locally on the rig.
- Definitely worth considering having a remote backup location for all the data.
- Better communication between the companies or chain of command, evident in the pointing of fingers.
- There was a systematic lack of testing or insufficient testing, evident in the misinterpretation of the Negative Pressure Test and the inability of the BSR to cut off the pipe.
- Not following through and completing jobs. Particularly, having the other 15 centralisers on board but not installing them and relying on the 6 that were already installed. Attributed to laziness or miscommunication.
- Stricter regulations and procedures. Be more prepared for when things do go wrong, because eventually everything does go wrong.
- Hindsight is very frustrating and it is impossible to predict everything. If and when mistakes are made it is not the end of the world, as long as you learn from them and adapt accordingly.
The Deepwater Horizon Accident: What happened and Why?
(Youtube: https://www.youtube.com/watch?v=aN2TIWomahQ)
Keep reading for my notes from the video!
1. The Process and Equipment of Deep-Water Drilling
The rig provides no mechanism for control at the top of the well. It has to keep itself in position; DWH used motors and thrusters. There were 4 separate crews, and it required 60 people to operate it continuously (12 hrs | 21 days). Isolation from shore. Multiple countries are involved. Very expenny! Once the fluids are in the riser you cannot prevent or stop them from coming. The vessel must have power at all times. The blind shear ram cuts the pipe and closes the top of the well. Everything that is done on the ocean floor is remote controlled.
2. Questionable Issues
When designing there were assumptions being made. Either decision could be the wrong one.
Casing Design
- Long string, rather than liner and tie-back
- No lockdown sleeve (at time of accident)
- Single string over several formations of different pressure

Cement Design
- Few centralisers
- Big casing, small hole
- Nitrogen foam cement
- No cement-bond log (used to determine if the cement was adequate)

Negative Pressure Test = standard practice
- Unclear procedures: there is no standard procedure for how it should be conducted or interpreted -> DWH test was incorrect
- Conducted soon after cementing
- Confusing because of unusual spacer
- Misunderstood by crew

Flow Monitoring
- Flows confusing due to offloading
- Insufficient response to flow indications
- Hydrocarbons entered the riser

Ignition
- Hydrocarbon flow through MGS -> DWH flow came onto the rig
- Gas entry into engine room, intake not auto
- Engine overspeed (overpowered the generator), power loss, fire

BOP (Blowout Preventer - purpose: to shut the well)
- Crew shut BOP, but failed to seal well
- EDS pushed but link to BOP lost in fire
- Automatic function did not work
- ROV (submarines) operation of BOP didn’t work
3. The Issues that Mattered in the Accident
Four Main Mistakes (they all had to happen):
The cement failed to seal off the producing reservoir(s). Casing seal failed
- No doubt that the cement job failed to isolate the formation
- Unclear why; the companies (BP and Halliburton) are not in agreement
- Need sufficient pressure in the well; the pressure “window” is very small
- The other 15 centralisers were never installed -> so only the 6 already in place provided centralising
- Seems likely that the float collar check valves and shoe track cement failed to seal the casing
Hydrocarbon inflow was not recognised, and hydrocarbon entered the riser
- Two negative tests conducted, and accepted by the crew as successful
- Negative test interpretation made more difficult by the presence of unusual spacer
- No standard procedure for negative test
- Pit levels confusing because of fluids being offloaded to service vessel
- Spacer separates mud from water -> spacer might have stuffed up the testing
Gas ignited on the rig, causing fire and loss of power
- Gas was diverted to the MGS, instead of to the overboard diverter [BP]
- IBOP was not closed [BP]
- Engine room intake closure was not activated automatically on gas alarm [testimony]
- Engine overspeed -> loss of power (and source of ignition?) [testimony]
The BOP failed to seal the well
3 modes of closure:
1. Manual (HP line, EDS)
2. Automatic (AMF on loss of power)
3. ROV (hot stab, autoshear - the BSR (blind shear ram) can only shear drill pipe, nothing else)
4. How to do it BETTER
BOEM Actions
- Safety Alerts -> posted to website, email distribution
- Regulatory Reviews
- Trending Analysis -> industry workshops, proposed rule making, policy changes
- Issue violations and subsequent civil penalties
- Cementing program and casing design must be certified by a registered professional engineer
- Inspectors are now witnessing subsea BOP stack testing (stump testing)
API also made recommendations!
His suggestions:
- A second blind shear ram (BSR)
- Independent BOP activation control (audio wave activation)
- More comprehensive data from BOP (position of rams, contents of tubulars)
- Real-time modelling of fluids and pressures in tubulars (as in simulations)
- Complete off-site transmission of data
0 notes
Text
DWH / BI Test Engineer
Company: Strategic Staffing Solutions International Company sector: City: Vilnius Field of activity: Finance, accounting, audit Position: Specialist Published: 2018.09.04 Valid until: 2018.09.28
CV.lt job listing in Vilnius: https://www.cv.lt/finansu-apskaitos-audito-darbai/dwh-bi-test-engineer-vilniuje-1-320095394/?cid=rss-info-link September 04, 2018 at 09:41AM
0 notes