#puppetlabs
Explore tagged Tumblr posts
kennak · 5 months ago
Quote
An important discussion was about to unfold entirely in chat, so I rolled it back to the original GitHub issue. When important context is shared casually in chat alone, it's easy to imagine that searching back through the history later becomes painful (or impossible), so I decided to take it over and carry it out in the open myself. Let's be deliberate about when to use flow-type versus stock-type information tools.
Transferred rubocop-i18n from puppetlabs to the rubocop org - koic's diary
2 notes · View notes
learnthingsfr · 2 years ago
Text
0 notes
globalmediacampaign · 4 years ago
Text
Automate Database Schema Object Check
Monitoring schema changes in MySQL/MariaDB is a huge help: it saves time when analyzing database growth, table definition changes, data size, index size, or row size. For MySQL/MariaDB, running a query referencing information_schema together with performance_schema gives you collective results for further analysis, and the sys schema provides views that serve as collective metrics, very useful for tracking database changes or activity. If you have many database servers, running those queries by hand all the time is tedious, and you still have to digest the results into something more readable and easier to understand. In this blog, we'll create an automation that serves as a utility tool for monitoring your existing databases and collecting metrics on database changes or schema change operations.

Creating Automation for Database Schema Object Check

In this exercise, we will monitor the following metrics:

Tables without primary keys
Duplicate indexes
A graph of the total number of rows in our database schemas
A graph of the total size of our database schemas

This exercise will give you a head start and can be modified to gather more advanced metrics from your MySQL/MariaDB database.

Using Puppet for our IaC and Automation

This exercise uses Puppet to provide the automation and generate the expected results based on the metrics we want to monitor. We'll not cover the installation and setup of Puppet, for either server or client, so I expect you to know how to use it. You might want to visit our old blog Automated Deployment of MySQL Galera Cluster to Amazon AWS with Puppet, which covers Puppet setup and installation. We'll use the latest version of Puppet in this exercise, but since our code consists of basic syntax, it would also run on older versions of Puppet.
Preferred MySQL Database Server

In this exercise, we'll use Percona Server 8.0.22-13, since I prefer Percona Server, mostly for testing and some minor deployments for either business or personal use.

Graphing Tool

There are tons of options, especially in the Linux environment. In this blog, I'll use the easiest one I found, the open-source tool https://quickchart.io/.

Let's Play with Puppet

The assumption I have made here is that you have set up a master server with a registered client that is ready to communicate with the master server and receive automatic deployments. Before we proceed, here's my server information:

Master server: 192.168.40.200
Client/Agent server: 192.168.40.160

In this blog, our client/agent server is where our database server is running. In a real-world scenario, it doesn't have to be, especially for monitoring. As long as it can communicate with the target node securely, that is a perfect setup as well.

Setup the Module and the Code

Go to the master server, and in the path /etc/puppetlabs/code/environments/production/modules, create the required directories for this exercise:

mkdir schema_change_mon/{files,manifests}

Create the files that we need:

touch schema_change_mon/files/graphing_gen.sh
touch schema_change_mon/manifests/init.pp

Fill up the init.pp script with the following content (note that the shell quotes inside the exec commands are escaped):

class schema_change_mon (
  $db_provider = "mysql",
  $db_user     = "root",
  $db_pwd      = "R00tP@55",
  $db_schema   = []
) {

  $dbs = ['pauldb', 'sbtest']

  service { $db_provider :
    ensure     => running,
    enable     => true,
    hasrestart => true,
    hasstatus  => true
  }

  # Log tables that have no primary key
  exec { "mysql-without-primary-key" :
    require => Service['mysql'],
    command => "/usr/bin/sudo MYSQL_PWD=\"${db_pwd}\" /usr/bin/mysql -u${db_user} -Nse \"select concat(tables.table_schema,'.',tables.table_name,', ', tables.engine) from information_schema.tables left join ( select table_schema, table_name from information_schema.statistics group by table_schema, table_name, index_name having sum( case when non_unique = 0 and nullable != 'YES' then 1 else 0 end ) = count(*) ) puks on tables.table_schema = puks.table_schema and tables.table_name = puks.table_name where puks.table_name is null and tables.table_type = 'BASE TABLE' and tables.table_schema not in ('performance_schema', 'information_schema', 'mysql');\" >> /opt/schema_change_mon/assets/no-pk.log"
  }

  # Log duplicate indexes, once per schema in $dbs
  $dbs.each |String $db| {
    exec { "mysql-duplicate-index-$db" :
      require => Service['mysql'],
      command => "/usr/bin/sudo MYSQL_PWD=\"${db_pwd}\" /usr/bin/mysql -u${db_user} -Nse \"SELECT concat(t.table_schema,'.', t.table_name, '.', t.index_name, '(', t.idx_cols,')') FROM ( SELECT table_schema, table_name, index_name, Group_concat(column_name) idx_cols FROM ( SELECT table_schema, table_name, index_name, column_name FROM information_schema.statistics WHERE table_schema='${db}' ORDER BY index_name, seq_in_index) t GROUP BY table_name, index_name) t JOIN ( SELECT table_schema, table_name, index_name, Group_concat(column_name) idx_cols FROM ( SELECT table_schema, table_name, index_name, column_name FROM information_schema.statistics WHERE table_schema='${db}' ORDER BY index_name, seq_in_index) t GROUP BY table_name, index_name) u WHERE t.table_schema = u.table_schema AND t.table_name = u.table_name AND t.index_name <> u.index_name AND locate(t.idx_cols,u.idx_cols);\" >> /opt/schema_change_mon/assets/dupe-indexes.log"
    }
  }

  $genscript = "/tmp/graphing_gen.sh"

  file { "${genscript}" :
    ensure => present,
    owner  => root,
    group  => root,
    mode   => '0655',
    source => 'puppet:///modules/schema_change_mon/graphing_gen.sh'
  }

  exec { "generate-graph-total-rows" :
    require   => [Service['mysql'], File["${genscript}"]],
    path      => [ '/bin/', '/sbin/', '/usr/bin/', '/usr/sbin/' ],
    provider  => "shell",
    logoutput => true,
    command   => "/tmp/graphing_gen.sh total_rows"
  }

  exec { "generate-graph-total-len" :
    require   => [Service['mysql'], File["${genscript}"]],
    path      => [ '/bin/', '/sbin/', '/usr/bin/', '/usr/sbin/' ],
    provider  => "shell",
    logoutput => true,
    command   => "/tmp/graphing_gen.sh total_len"
  }
}

Fill up the graphing_gen.sh file. This script runs on the target node and generates graphs for the total number of rows and the total size of our databases. To keep it simple, the script only handles MyISAM and InnoDB tables.

#!/bin/bash
# Usage: graphing_gen.sh [total_rows|total_len]
graph_ident="${1:-total_rows}"
unset json myisam innodb nmyisam ninnodb
json='' myisam='' innodb='' nmyisam='' ninnodb='' url=''

# Collect per-schema, per-engine metrics as a JSON array
json=$(MYSQL_PWD="R00tP@55" mysql -uroot -Nse "select json_object('dbschema', concat(table_schema,' - ', engine), 'total_rows', sum(table_rows), 'total_len', sum(data_length+index_length), 'fragment', sum(data_free)) from information_schema.tables where table_schema not in ('performance_schema', 'sys', 'mysql', 'information_schema') and engine in ('myisam','innodb') group by table_schema, engine;" | jq . | sed ':a;N;$!ba;s/\n//g' | sed 's|}{|},{|g' | sed 's/^/[/g' | sed 's/$/]/g' | jq '.')

innodb=""
myisam=""
# Split labels and data points by storage engine
for r in $(echo $json | jq 'keys | .[]'); do
  if [[ $(echo $json | jq .[$r].'dbschema') == *"MyISAM"* ]]; then
    nmyisam=$(echo $nmyisam || echo '')$(echo $json | jq .[$r]."${graph_ident}")','
    myisam=$(echo $myisam || echo '')$(echo $json | jq .[$r].'dbschema')','
  else
    ninnodb=$(echo $ninnodb || echo '')$(echo $json | jq .[$r]."${graph_ident}")','
    innodb=$(echo $innodb || echo '')$(echo $json | jq .[$r].'dbschema')','
  fi
done

# Strip trailing commas
myisam=$(echo $myisam | sed 's/,$//g')
nmyisam=$(echo $nmyisam | sed 's/,$//g')
innodb=$(echo $innodb | sed 's/,$//g')
ninnodb=$(echo $ninnodb | sed 's/,$//g')
echo $myisam "|" $nmyisam
echo $innodb "|" $ninnodb

# Build the Chart.js payload and fetch the rendered PNG from quickchart.io
export url=$(echo "{type:'bar',data:{labels:['MyISAM','InnoDB'],datasets:[{label:[$myisam],data:[$nmyisam]},{label:[$innodb],data:[$ninnodb]}]},options:{title:{display:true,text:'Database Schema Total Rows Graph',fontSize:20,}}}")
curl -L -o /vagrant/schema_change_mon/assets/db-${graph_ident}.png -g https://quickchart.io/chart?c=$(python -c "import urllib,os,sys; print urllib.quote(os.environ['url'])")

Lastly, go to the module path directory, which is /etc/puppetlabs/code/environments/production in my setup, and create the file manifests/schema_change_mon.pp:

touch manifests/schema_change_mon.pp

Then fill manifests/schema_change_mon.pp with the following contents:

node 'pupnode16.puppet.local' { # Applies only to the mentioned node. If none is mentioned, it applies to all.
  class { 'schema_change_mon': }
}

If you're done, you should have the following tree structure, just like mine:

root@pupmas:/etc/puppetlabs/code/environments/production/modules# tree schema_change_mon
schema_change_mon
├── files
│   └── graphing_gen.sh
└── manifests
    └── init.pp

What does our module do?

Our module, schema_change_mon, collects the following. First,

exec { "mysql-without-primary-key" : ...

executes a mysql command that runs a query to retrieve tables without primary keys. Then,

$dbs.each |String $db| { exec { "mysql-duplicate-index-$db" :

collects duplicate indexes that exist in the database tables. Next, these lines generate graphs based on the collected metrics:

exec { "generate-graph-total-rows" : ...
exec { "generate-graph-total-len" : ...

Once the query runs successfully, the graph is generated through the API provided by https://quickchart.io/. The log files, meanwhile, simply contain the table and index names.
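As an aside, the python -c one-liner at the end of graphing_gen.sh uses Python 2's urllib.quote. If the target node only has Python 3, the equivalent is urllib.parse.quote; here is a minimal, self-contained sketch (the chart payload is shortened for illustration):

```python
import os
import urllib.parse

# A shortened Chart.js-style payload, in the same shape graphing_gen.sh builds
os.environ['url'] = "{type:'bar',data:{labels:['MyISAM','InnoDB']}}"

# Python 3 equivalent of: python -c "import urllib,os; print urllib.quote(os.environ['url'])"
encoded = urllib.parse.quote(os.environ['url'])
print("https://quickchart.io/chart?c=" + encoded)
```

Percent-encoding the payload before handing it to curl avoids surprises from the braces, brackets, and quotes in the raw Chart.js string.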
See the result below:

root@pupnode16:~# tail -n+1 /opt/schema_change_mon/assets/*.log
==> /opt/schema_change_mon/assets/dupe-indexes.log <==
pauldb.c.my_index(n,i)
pauldb.c.my_index2(n,i)
pauldb.d.a_b(a,b)
pauldb.d.a_b2(a,b)
pauldb.d.a_b3(a)
pauldb.d.a_b3(a)
pauldb.t3.b(b)
pauldb.c.my_index(n,i)
pauldb.c.my_index2(n,i)
pauldb.d.a_b(a,b)
pauldb.d.a_b2(a,b)
pauldb.d.a_b3(a)
pauldb.d.a_b3(a)
pauldb.t3.b(b)

==> /opt/schema_change_mon/assets/no-pk.log <==
pauldb.b, MyISAM
pauldb.c, InnoDB
pauldb.t2, InnoDB
pauldb.d, InnoDB
pauldb.b, MyISAM
pauldb.c, InnoDB
pauldb.t2, InnoDB
pauldb.d, InnoDB

Why Not Use ClusterControl?

While our exercise showcases automation for collecting database schema statistics such as changes or operations, ClusterControl provides this as well, along with other features, so you don't need to reinvent the wheel. ClusterControl can provide transaction logs, such as deadlocks as shown above, or long-running queries as shown below. ClusterControl also shows the DB growth, and gives additional information such as number of rows, disk size, index size, and total size. The schema analyzer, under the Performance tab -> Schema Analyzer, is very helpful: it lists tables without primary keys, MyISAM tables, and duplicate indexes, and it raises alarms when duplicate indexes or tables without primary keys are detected. You can check out more information about ClusterControl and its other features on our product page.

Conclusion

Providing automation for monitoring your database changes and schema statistics, such as writes, duplicate indexes, and operational updates like DDL changes, is very beneficial to DBAs. It helps to quickly identify the weak links and problematic queries, giving you an overview of the possible causes of bad queries that would lock up or stall your database.
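Since the logs use a simple flat format, they are easy to post-process when you monitor many servers. As a quick illustrative sketch (not part of the original module), a few lines of Python can count no-PK tables per storage engine from no-pk.log-style lines:

```python
from collections import Counter

# Sample lines in the "schema.table, engine" format written to no-pk.log
log_lines = [
    "pauldb.b, MyISAM",
    "pauldb.c, InnoDB",
    "pauldb.t2, InnoDB",
    "pauldb.d, InnoDB",
]

# Count tables without a primary key, grouped by storage engine
by_engine = Counter(line.rsplit(", ", 1)[1] for line in log_lines)
print(by_engine)  # e.g. 3 InnoDB tables, 1 MyISAM table
```

The same split-and-count approach works for dupe-indexes.log if you key on the schema or table prefix instead.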
Tags: database automation, monitoring, schema check, Puppet
https://severalnines.com/database-blog/automate-database-schema-object-check
0 notes
k21academy · 4 years ago
Text
[AZ-400] Microsoft Azure DevOps Engineer
azure devops certification
What Is Azure DevOps?
Azure DevOps is a Software as a service (SaaS) platform from Microsoft that provides an end-to-end DevOps toolchain for developing and deploying software. It also integrates with most leading tools on the market and is a great option for orchestrating a DevOps toolchain.
At DevOpsGroup, we have lots of customers who have found Azure DevOps fits their needs irrespective of their language, platform or cloud.
Why Should You Learn About Azure?
DevOps practitioners are among the highest paid IT professionals today, and market demand for them is growing rapidly because organizations using DevOps practices are overwhelmingly high-functioning. According to a recent report published by Puppet Labs, the State of DevOps Report, organizations using a DevOps approach deploy code up to 30 times more frequently than their competitors, and 50 percent fewer of their deployments fail.
In the last two years, listings for DevOps jobs on Indeed.com increased 75 percent. On LinkedIn.com, mentions of DevOps as a skill increased 50 percent. In a recent survey by Puppetlabs, half of their 4,000-plus respondents (in more than 90 countries) said their companies consider DevOps skills when hiring.
Benefits Of Azure DevOps?
Any language, any platform, any Cloud
Fully integrated with end-to-end traceability
Easily build and push images to container registries like Docker Hub and Azure Container Registry.
Deploy containers to individual hosts or Kubernetes
To know more about Azure DevOps Certification, click here
0 notes
mackensen · 5 years ago
Photo
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
@cumaeansibyl​ and I spent ten days in Central Europe in 2019, and one of the places we visited was Lindau. We were staying in Basel and already planning to visit Bregenz, Austria (but that’s another story). The direct path to Bregenz was by a combination of Swiss and Austrian trains via St. Margrethen, along the south bank of the Bodensee (Lake Constance).
The day before, we were at a party and one of the locals told us that we could go via German trains to Lindau, and then take a quick ferry across the lake to Bregenz. The timing was comparable, and because of the Baden Württemberg pass was considerably cheaper. Why take two trains through two countries when you could take two trains and a ferry through three countries?
Thus it was that we found ourselves in Lindau, a charming city in Bavaria, Germany, on the north coast of the Bodensee and the only city on the lake with a lighthouse. There’s regular ferry service to various destinations in Germany, Austria, and Switzerland, plus plenty of trains in all directions.
But most important of all, there was a cheap snack shop at Lindau Hauptbahnhof, and Liz experienced her first doner kebab sitting on a bench at a German train station. As one does.
0 notes
digitalconvo · 5 years ago
Text
DevSecOps Market – Global Industry Analysis, Size, Share
DevSecOps Market Introduction
The term DevSecOps (or SecDevOps) is a combination of development, security, and operations; DevSecOps is a set of software development practices employed to improve the delivery characteristics of a system, such as speed, flexibility, security, and scalability. DevSecOps is considered an evolved version of DevOps, which itself combines software development with information technology (IT) operations. As relying on traditional, control- and ownership-based security techniques can lead to failures in the digital world, a mounting number of IT organizations and businesses undergoing digital transformation are adopting DevSecOps services and solutions.
DevSecOps is one of the latest trends in the IT industry. DevSecOps focuses on implementing the security aspect in all the software development procedures, such as conception, deployment, implementation, and maintenance, rather than only focusing on the application security. Leading players in the DevSecOps market are shifting their focus on various parameters of digital businesses, especially cloud infrastructure and business agility and aiming to offer new levels of development capability, cost efficiency, and scalability for organizations of all sizes.
Get Brochure of the Report @  https://www.tmrresearch.com/sample/sample?flag=B&rep_id=4887    
DevSecOps Market – Notable Developments
In October 2018, IBM (International Business Machines Corporation) – an American multinational information technology company in the DevSecOps market – announced that it has entered a definitive agreement under which the company will acquire Red Hat, Inc. – an American multinational software company and also a leading player in the DevSecOps market – for approximately US$ 34 billion. This acquisition is likely to change the future of DevSecOps market dramatically, as IBM is likely to become the top hybrid cloud provider in the world post this acquisition. With the merger of two giants in the cloud business, the DevSecOps market is likely to undergo major transformation in the upcoming years.
In June 2018, ThreatModeler Software Inc. – a leading provider of automated threat modeling solutions in the DevSecOps market – launched its new Threat-Modeling-as-a-Service (TMaaS) solution. The company highlighted the unique “as-a-Service” benefits of TMaaS, such as the flexibility of XaaS models and cost-efficiency.
In May 2018, Micro Focus International plc – a multinational software and information technology company in the DevSecOps market – announced the launch of its new IT Operations Management (ITOM) Platform. The company also stated that the ITOM Platform is the first IT operations platform built on containerized microservices, and that it integrates DevOps with AIOps, which can prove highly beneficial for large-scale hybrid IT environments as it improves service delivery speed.
Some of the most prominent competitors operating in the competitive landscape of global DevSecOps market include –
IBM
Microsoft
Google
CA Technologies
Synopsys
MicroFocus
Dome9
Qualys
Entersoft
CyberArk
Splunk
Chef Software
Rough Wave Software
PaloAltoNetworks
4Armed
Algo Sec
Contrast Security
Aqua Security
Whitehat Security
Threat Modeler
Check Marx
Puppetlabs
Sumologic
Continuum Security
DevSecOps Market Dynamics
Emerging Growth of the SMEs Sector will Redefine the Target Customer Base in DevSecOps Market
Large enterprises are leveraging their investment capabilities to incorporate DevSecOps services that improve their business efficiency and security, and they dominate the DevSecOps market at this moment. However, despite the current dominance of top-tier companies as the leading segment of end-user organizations in the DevSecOps market, small- and medium-sized businesses will emerge as a popular target customer base for DevSecOps vendors in the upcoming years.
An increasing number of SMEs across the world, coupled with a mounting number of digitally operating SMEs, is likely to complement growth of the DevSecOps market not only in developed but also in developing countries around the world. For example, in 2015, 1.87 million SMEs were registered in the U.K., and by the end of 2016, a hundred thousand more start-ups had been added to the list, according to a report by the European Commission. In the United States, nearly 30 million SMEs have been registered so far, according to the U.S. Small Business Administration.
To get Incredible Discounts on this Premium Report, Click Here @ https://www.tmrresearch.com/sample/sample?flag=D&rep_id=4887
Asia Pacific Region to Create Lucrative Sales Opportunities for DevSecOps Market Players
With the incremental industrial growth in the region, Asia Pacific is pegged to be a highly profitable regional market for DevSecOps vendors in the coming future. A wide range of industrial sectors in the region are undergoing digital transformation with rapid advancements in technologies such as the Internet of Things (IoT), IT infrastructure services, and cloud computing. A mounting number of organizations and businesses have begun to adopt DevSecOps solutions and services in the Asia Pacific region, which creates a positive growth environment for DevSecOps market players.
Furthermore, the operations of at least 7 in 10 micro, small, and medium enterprises (MSMEs) in Southeast Asian countries depend mainly on labor-intensive services, according to the ‘ASEAN Small and Medium Enterprises (SME) Policy Index 2018’ report by the Association of Southeast Asian Nations (ASEAN). This is likely to create more lucrative opportunities for DevSecOps vendors to improve penetration and capture untapped markets in the Asia Pacific region in the upcoming years.
DevSecOps Market Segmentation
Based on the deployment of DevSecOps, the DevSecOps market is segmented into,
Cloud
On-premises
Based on the size of the organization, the DevSecOps market is segmented into,
Large Enterprises
Small- and Medium-sized Enterprises (SMEs)
Based on the components of DevSecOps, the DevSecOps market is segmented into,
Services
Solutions
Based on end-user sectors, the DevSecOps market is segmented into,
Banking, Financial Services and Insurance (BFSI)
Information Technology (IT) & Telecommunications
Media & Entertainment
Retail & Consumer Goods
Government & Public Sector
Manufacturing
Healthcare & Life Sciences
Energy & Utilities
Request For TOC @ https://www.tmrresearch.com/sample/sample?flag=T&rep_id=4887
About TMR Research:
TMR Research is a premier provider of customized market research and consulting services to business entities keen on succeeding in today’s supercharged economic climate. Armed with an experienced, dedicated, and dynamic team of analysts, we are redefining the way our clients’ conduct business by providing them with authoritative and trusted research studies in tune with the latest methodologies and market trends.
0 notes
Text
DevSecOps Market by Component, Deployment Type, Organization Size, Vertical, and Region - Global Forecast to 2023
Tumblr media
According to market research report "DevSecOps Market by Component (Solution and Services), Deployment Type (On-premises and Cloud), Organization Size, Vertical (BFSI, IT and Telecommunications, Manufacturing, and Government and Public Sector), and Region - Global Forecast to 2023", The DevSecOps market size is expected to grow from USD 1.5 billion in 2018 to USD 5.9 billion by 2023, at a Compound Annual Growth Rate (CAGR) of 31.2% during the forecast period.
The growing need for highly secure continuous application delivery and the increased focus on security and compliance are the major growth factors for the DevSecOps market.
Browse and in-depth TOC on “DevSecOps Market”
65 - Tables
37 - Figures
148 - Pages
Companies with DevOps framework adopt DevSecOps framework to deliver higher levels of security and efficiency in their applications
DevSecOps solutions provide customers with the required set of tools, enabling the security teams to efficiently align with the DevOps team and deliver the required security changes, resulting in continuous monitoring of attacks and defects. Companies with a DevOps framework are adopting the DevSecOps framework to deliver higher levels of security and efficiency in the applications being built. This results in the presence of a higher level of security at every stage of the software delivery life cycle, enabling clients to experience reduced compliance costs and achieve faster application release and delivery.
Download PDF Brochure@ https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=122458587
Cloud deployment type would help in enabling enhanced infrastructure scalability and performance
Cloud deployment of DevSecOps solutions helps organizations deploy their code to production with enhanced security, performance, and scalability. The cloud deployment type offers organizations benefits such as increased scalability, speed, 24/7 service, and enhanced IT security. DevSecOps practices can help reduce organizations' Operational Expenditure (OPEX) through process standardization and automation, with complete control over and availability of an environment based on users' needs.
DevSecOps solutions would help deliver applications faster and efficiently and reduce vulnerabilities and attacks to a huge extent
The adoption of DevSecOps solutions is higher among the large enterprises segment, and the trend is expected to continue during the forecast period. The rising demand for faster development and release cycles, as well as the need for early integration of security tools into DevOps processes, provides large enterprises with various benefits.
Speak To Analyst@ https://www.marketsandmarkets.com/speaktoanalystNew.asp?id=122458587
APAC is expected to dominate the global DevSecOps market during the forecast period
The global DevSecOps market by region covers North America, Asia Pacific (APAC), Europe, Middle East and Africa (MEA), and Latin America. APAC has a favorable market for the DevSecOps solution and service vendors. The region has major developing economies and a broad customer base for many industries. Due to the huge customer potential of this region, organizations around the globe want to set their footprint here. Due to these factors, the adoption of DevSecOps solutions and services would prove to be beneficial for enterprises, as they can enjoy the benefits of shorter development life cycles, improved operational efficiency, enhanced automation, and reduced costs.
The report also studies various growth strategies, such as mergers and acquisitions, partnerships and collaborations, and developments, adopted by the major players to expand their presence in the global DevSecOps market. The major vendors in the global DevSecOps market include CA Technologies (US), IBM (US), MicroFocus (UK), Synopsys (US), Microsoft (US), Google (US), Dome9 (US), PaloAltoNetworks (US), Qualys (US), Chef Software (US), Threat Modeler (US), Contrast Security (US), CyberArk (Israel), Entersoft (Australia), Rough Wave Software (US), Splunk (US), 4Armed (UK), Aqua Security (Israel), Check Marx (Israel), Continuum Security (Spain), Whitehat Security (US), Sumologic (US), Puppetlabs (UK),and Algo Sec (US).
About MarketsandMarkets™
MarketsandMarkets™ provides quantified B2B research on 30,000 high-growth niche opportunities/threats which will impact 70% to 80% of worldwide companies’ revenues. We currently serve 7,500 customers worldwide, including 80% of global Fortune 1000 companies as clients. Almost 75,000 top officers across eight industries worldwide approach MarketsandMarkets™ for their pain points around revenue decisions.
Our 850 full-time analysts and SMEs at MarketsandMarkets™ are tracking global high-growth markets following the "Growth Engagement Model – GEM". The GEM aims at proactive collaboration with clients to identify new opportunities, identify the most important customers, write "Attack, avoid and defend" strategies, and identify sources of incremental revenue for both the company and its competitors. MarketsandMarkets™ is now coming up with 1,500 MicroQuadrants (positioning top players across leaders, emerging companies, innovators, and strategic players) annually in high-growth emerging segments. MarketsandMarkets™ is determined to benefit more than 10,000 companies this year for their revenue planning and to help them take their innovations/disruptions early to the market by providing research ahead of the curve.
MarketsandMarkets’s flagship competitive intelligence and market research platform, "Knowledgestore" connects over 200,000 markets and entire value chains for deeper understanding of the unmet insights along with market sizing and forecasts of niche markets.
Contact:
Mr. Sanjay Gupta
MarketsandMarkets™ INC.
630 Dundee Road
Suite 430
Northbrook, IL 60062
USA : 1-888-600-6441
Content Source:
https://www.marketsandmarkets.com/PressReleases/devsecops.asp
0 notes
pdxpedpow · 7 years ago
Photo
Tumblr media
The people @lifeatpuppet have a relatable sense of humor😂🤣😂 Shout out to whoever did this👌 #puppetlabs #softwaredeveloper #art #artist #foodiegram #octopus #wedeliver #wecaterpdx #riderwill #oneworld https://ift.tt/2FEx614
0 notes
stefanstranger · 7 years ago
Text
RT @bgelens: New blog post on using DSC with Puppet. This time about notifications! https://t.co/3bIQ7MmPN1 #PSDSC #puppetlabs https://t.co/dbcRk87gCY
New blog post on using DSC with Puppet. This time about notifications!https://t.co/3bIQ7MmPN1 #PSDSC #puppetlabs pic.twitter.com/dbcRk87gCY
— Ben Gelens (@bgelens) February 1, 2018
from Twitter https://twitter.com/sstranger February 02, 2018 at 05:08PM via IFTTT
0 notes
alttheatrenyc · 8 years ago
Link
via Twitter https://twitter.com/AltTheatreNYC
0 notes
ebizworldwide · 8 years ago
Text
DevOps, Culture Change and the Brass Ring of Velocity
The world of technology is not for the faint of heart. It can be high-stakes, and the margin between wild success and yesterday's news is razor thin at times.
To stay one step ahead of the competition in the battle for market share, many technology companies have begun to fundamentally shift how they operate in order to increase the velocity with which they build, test, and release software. What originally started with the agile movement is now evolving into a new philosophical way of working: DevOps.
DevOps is not a replacement for agile or lean methods but a supplement to them. It fills in the gaps to help technology companies break down functional silos, automate as much as possible in the spirit of speed and quality, and fine-tune operational processes to allow for a pace that was unheard of ten years ago.
The thought of a technology company deploying ten updates to an app in a day was preposterous just a few years ago. Now, those who have made the shift to a DevOps environment are doing this on a regular basis.
It’s a culture thing.
As an organizational psychologist who focuses on workplace culture, I find the topic of DevOps a fascinating case study. Technology companies are now challenging their long-standing beliefs and assumptions and shifting deeply held ways of working to allow for dramatically better collaboration and velocity.
As I continue to consider DevOps, at its core, as a culture change, I can't help but start to frame my own thoughts about what DevOps is and isn't. Here's what I've come up with so far:
DevOps is less about what we do and more about how we do it. Technology infrastructure and evolving processes are important in successfully transforming an organization to DevOps principles. But at the end of the day, DevOps is about how work gets done, and how people interact with each other and with technology to drive performance.
DevOps is not an off-the-shelf solution that can simply be implemented. Unfortunately, there is no one-size-fits-all approach for how DevOps should be carried out. Again, since DevOps is fundamentally a cultural change in the way work gets done, it will take somewhat different forms depending on the organization. That's okay. Chef's CTO, Adam Jacob, does a phenomenal job articulating this nuance in his 2015 article in ReadWrite.
The people side is critical to the equation. While most of the DevOps articles I've come across do make passing mention of the people side of the DevOps equation, very few go beyond a brief statement that it's important before moving on to the process and technology/infrastructure components of the transformation. If we truly want to change the way people work to drive speed in the tech world, then we need to take a deeper look at this human side of DevOps, and the ways it can support or derail a sustainable DevOps transformation.
DevOps is not a job, it's everyone's job. A quick job search on Indeed.com limited to the Seattle, WA area found over 500 job openings with the term DevOps in the title. Klint Finley's Wired article highlights this trend of tech companies changing job titles to include DevOps. Yet, according to Finley, DevOps is not a job. It's a critical way of working together to drive performance. If this is true (which I happen to completely agree with), then I will go a step further and argue that DevOps isn't one person's job, it's everybody's job.
The people part of the model seems to be the part that is least defined. I've found several DevOps models in my research, but none have quite hit home with me. Most make mention of the people or culture element of a DevOps transformation, but they tend to do so in name only, focusing much more heavily on the infrastructure components. It's important to realize that more "stuff" is not the solution to a successful DevOps transition, or any cultural transition for that matter. I dive deeper into this idea in another recent article.
A DevOps transition can be likened to any organizational change in the way work gets done, whether that be organizations moving toward a client-focused culture, those wanting a culture that excels at quality and consistency, or those working toward operating at greater speed. If this is true, then focusing on the fundamental aspects of organizational culture that drive the behaviors needed to execute on the business strategy becomes the coatrack on which all the other systems, process, and people changes hang.
There is clear evidence to suggest that a DevOps approach to technology development can have a significant impact on the velocity of an IT organization. This is supported by the State of DevOps Report produced by PuppetLabs on an annual basis.
That said, there is also data suggesting that many DevOps transformation efforts fail to deliver on expectations: the existing culture of the IT organization does not allow people to behave in the ways necessary to make the jump to DevOps. Based on this, I submit that it's time we rethink our approach to sustainable DevOps transformation. We need to develop a more comprehensive approach that considers the various people and cultural factors that help reinforce new ways of working.
That stated, there is likewise data that recommends that a lot of DevOps improvement efforts cannot supply on assumptions. The existing culture of the IT organization does not enable individuals to act in the methods necessary making the jump to DevOps. Based on this, I send that it's time that we review our approach to sustainable DevOps makeover. We have to begin to create a more extensive technique that considers the different people as well as cultural aspects that assist enhance new methods of working.
The trophy of rate is accessible for these forward believing tech firms. The concern is, could they welcome the business adjustment needed to get hold of it?
0 notes
tnoda-clojure · 11 years ago
Link
The claim that "Clojure has few commercial adoption cases" is now a thing of the past.
Puppet migrates from Ruby to Clojure
Puppet, well known as a DevOps automation tool, was written in Ruby, but parts of it had already been written in Clojure. This announcement says it's no longer just parts: they are releasing Puppet Server, a replacement for the Puppet master, and it is written in Clojure. That surprised me a little. Then again, Puppet Labs had been building TrapperKeeper, an application framework for Clojure, so it was somewhat predictable.
From now on, when someone asks "what products are written in Clojure?", answer: Puppet Server.
Green Clojure is three times faster than red Ruby
The article includes a benchmark comparison between the Ruby version (Apache/Passenger) and the Clojure version (TrapperKeeper). Roughly speaking, the Clojure version is about three times faster than the Ruby version, so the migration to Clojure is already paying off on the performance front.
Seeing this, you might conclude "Clojure is fast, Ruby is slow," but my reaction was the opposite: if Ruby is only three times slower than Clojure, then Ruby's performance is actually not that bad.
Because Clojure really is fast. With only about a 3x gap, they arguably could have stayed with Ruby, so the real reasons for adopting Clojure must lie elsewhere.
Using Ruby from Clojure
"Using Ruby from Clojure" sounds like the title of a typical personal blog post or Qiita entry, but in Puppet Server, to maintain compatibility with the Puppet master, they create JRuby instances from Clojure and run the old Puppet master's Ruby code inside them. Thanks to this, they can extend functionality in Clojure while reusing existing assets, without rewriting the Puppet master's Ruby code in Clojure: the best of both worlds.
This is possible thanks to Clojure's strong Java interoperability and JRuby's maturity. It's a useful reference for how to migrate an existing Ruby project to Clojure.
Wrapping up
The Ruby-based Puppet master is being replaced by Puppet Server, written in Clojure. The JRuby + Clojure combination succeeded in delivering both:
Better performance (Clojure + JVM = 3x faster)
Reuse of existing assets (Clojure + JRuby)
Clojure adoption may grow less through greenfield Clojure development and more through migrations from existing Ruby projects like this one.
Personally, I had been thinking Salt looked nice, but I'm likely to go back to Puppet Server.
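As a rough illustration of the general technique (not Puppet Server's actual implementation, and variable names here are made up), embedding JRuby from Clojure is mostly a matter of wrapping JRuby's embed API, `org.jruby.embed.ScriptingContainer`:

```clojure
;; Minimal sketch: evaluating legacy Ruby code from Clojure via JRuby.
;; Assumes the JRuby jar is on the classpath.
(import '(org.jruby.embed ScriptingContainer))

;; Each container hosts an isolated Ruby runtime inside the JVM.
(def container (ScriptingContainer.))

;; Define a Ruby method in the embedded runtime...
(.runScriptlet container "def greet(name); \"Hello, #{name}\"; end")

;; ...then call it; the result comes back as an ordinary JVM value
;; usable directly from Clojure.
(.runScriptlet container "greet('Puppet')")
```

This is the same shape of interop that makes it possible to keep the old Ruby code running unchanged while new functionality is written in Clojure around it.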
16 notes · View notes
lkanies · 13 years ago
Text
New bike bag - Mission Workshop Sanction
I've been using Chrome messenger bags for about ten years, and I've really liked them. Mine have been all over the world with me, and at least one of them still looks nearly brand new. However, my riding these days is basically straight commuting between home and the office, so I rarely need the on-the-bike access that a messenger bag gives me.
Even worse, my left shoulder, which bears the whole bag, has recently decided it hates my guts and would like me to die, or at least share the load across both shoulders.  As much as I like a night on the couch with bourbon, painkillers, and a hot water bottle, I can't afford to get into the habit of that kind of thing.
So, I went to start looking at new bags.  Because I live in PDX, I need something waterproof; because I live in Portland but travel to San Francisco, Palo Alto, NYC, London, and more, I need something that looks good; and because I travel a ton and hate overpacking, I need something small (where my definition of small is I can *barely* fit my daily needs plus an extra shirt/socks/etc for one night away).  All that, plus of course it should be comfortable and last 10 or 20 years.  I've also been trying hard to buy things that don't look quite as plasticky as most, um, everything looks these days, so extra points for natural fibers or something that does a good job of faking it.
I looked at quite a few other bags, but everything seemed to fail really quickly on either 'look good' or 'be small', and many failed on both.  It was obvious right away that I had to give up on buying anything that didn't look plasticky - everything was plastic or nylon, and really obviously so.  One of the other things about most of the bags I found was that they tended to be more like gear and less like everyday bags, and they tended to be very complicated with lots of straps and pockets and things.  This could be done well, but in most cases (heh heh) the bags were either too complicated, with lots of straps and extra pockets, or too simple, with just a big middle pocket and little to no organization.
I decided I had to see a bunch of bags in person, so I set out to some shops in town with my then-day bag full of everything I carry with me every day and a bit of extra clothing to simulate an overnight trip (yes, I pack light).  My first stop was River City Cycles, a fantastic shop for commuters, and that's where I ended up buying my bag.  I was able to try my top contenders there (from Chrome and Timbuk2), but the Chrome bags were, um, gigantic, and the Timbuk2 bags were just too plasticky for me.
The first time I looked at the Mission Workshop Sanction, it looked both huge and plasticky.  However, after trying everything else, I decided to actually load my stuff in one before I left.  Then I noticed that they had two varieties - one made of nylon (linked above) and one made of waxed canvas, which I adore.  It wears kind of like leather, in that it looks better with age, but it's about as waterproof as you can get without plastic bags and has a great feel.
After loading everything into this bag, I realized it was a lot smaller than it looked (it helped that I had tried some truly big bags prior to this).  In fact, it was kind of too small for my torso, given how the straps worked out.  However, I wore it around the shop for a while, and based on the fact that it was the last one they had in the gray color I liked (along with River City's great return policy), I decided to buy now and try it over the weekend.  In case it's not obvious, I kept the bag after all.
Here, roughly, is how I tested it out.
First, my daily load (with an extra jacket just in case) outside of the bag.  It's mostly laptop, ipad, a cable bag or two, and a bunch of little things:
Then the bag actually loaded up.  This is how I'd use the bag 95% of the time, so it's the most important test:
Here's how the bag looks open, so you can see where it all goes:
And you can see when it stands up that it's really not very full:
I also tried it with a full load of overnight clothing, in an Eagle Creek clothing folder (which I really only use on overnight trips):
You can see that the bag is pretty round with this in it.
I also took a shot that showed two aspects of the bag I wasn't so fond of, and one nice feature:
My cat hair sticks to the bottom, and the bottom part of the straps are too long (I've bound mine with rubber bands).  I do like the loop for attaching a bike light, though.
I was able to fit everything into the bag in all cases, but I barely fit the overnight kit (and in fact, on my first actual overnight trip with the bag, I had to leave some stuff behind, which I consider a feature).  The bag also never got too round or uncomfortable, and after going on a few walks and bike rides, I was comfortable that this was the right bag.
I looked into the bag a bit more (yes, I'm a bit maniacal on my research), and it turns out this waxed canvas version is a special version, so it turns out that one of the best features of this bag is that no one else in my office can have one, because it's all sold out. :)  Yes, the Puppet Labs office is full of bike and gear nuts, with a strong overlap on people who pack really light.  There is another special version of the Sanction, though, that might be worth checking out, and the normal version is still a great bag.
After using the pack for a month, including on an overnight trip, I'm still happy with it.  My only real complaint is that it doesn't have a handle on the side, which I would like for when I can't put it on my back.  I didn't expect to use the "special" laptop compartment, because it's just a separate pocket behind the main pocket, but it ends up being really convenient - the main pocket can be packed full, but I can still slide my laptop in and out of its pocket.  It also turns out that the little pocket in front perfectly fits my ipad, which is an added bonus.
7 notes · View notes
mattdevuk · 11 years ago
Text
July 21st-23rd Update? :D
Okay so I've been a bit behind with these updates. It's hard to do so when I don't have my apartment to live in with decent internet :(
Luckily there's not too much to update; I've mostly been focusing on setting up the PuppetDBQuery module and working out how best to use it in our scenario. That seems to be done and working now. One thing I did notice, which I believe to be an issue with PuppetDB, was that overnight our Puppetmaster process increased its RAM usage steadily until, by the morning, it was high enough to stop it from successfully updating its own catalogue. After stopping the process and letting it release its resources, it seemed to keep itself under control again, and right up until now, 10PM the following night, it seems to be working fine. Maybe it was just a one-off glitch, but I'll have to keep it in mind over the next few weeks.
Not related to either of the above points, but one issue I had to deal with lately was the particular syntax for selecting an array item in Puppet.
If it is being used in an ERB template, the correct format is <%= @array_name [0] %>, the important part being the space between the array name and the index. If this space does not exist, the parser will not be able to read the variable and will leave it blank.
If the variable is being used via string interpolation, the correct format is $variable="${arrayname[0]}", the important part being the lack of a space between the array name and the index. I found, especially when using Puppet's host resource, that if you put a space in, it will fail to read the variable properly and leave it empty, throwing an error if used as the ip property.
The final scenario I came across was using string interpolation to generate another array. This one sounds a bit complicated, so I'll show an example.
$array1 = ["a", "b", "c", "d", "e"]
$array2 = ["${array1[0]}", "${array1[2]}", "${array1[4]}"]
In this situation, it doesn't matter whether there is a space between the array name and the index.
These aren't complicated things to remember but if you don't know about them and suddenly after an update everything's going crazy and you're staring at one of them going "but it looks perfectly fine" you'll kick yourself if it comes down to a simple space in the wrong place!
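To keep the three cases straight, here's a minimal consolidated sketch. The variable and resource names are made up for illustration, and the spacing behavior is as observed on the Puppet setup described in this post:

```puppet
# Hypothetical manifest fragments covering the three array-index cases.

$servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

# Case 1 - ERB template: put a space between the array name and the index,
# or the parser leaves the value blank:
#   <%= @servers [0] %>

# Case 2 - string interpolation: NO space between the array name and the
# index. A space here would leave the value empty and, for a host resource,
# throw an error on the ip property.
host { 'primary-server':
  ip => "${servers[0]}",
}

# Case 3 - building one array from another via interpolation: the space
# is optional either way.
$subset = ["${servers[0]}", "${servers[2]}"]
```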
I don't think there's anything more to update you guys on.
I will say, that tomorrow (Thursday 24th) there is a webinar conducted by PuppetLabs to explain and explore the new features in Puppet Enterprise 3.3 - Link
There is also a webinar by Atlassian tomorrow too, about using Git for Continuous Integration - Link
1 note · View note
bookertrex · 12 years ago
Photo
1 note · View note
marcosortiz · 13 years ago
Text
PuppetConf 2012 re-cap
http://puppetconf.com/blog/puppetconf-2012-crowdsourced-highlights 10th ANNIVERSARY OF THE FOUNDING OF THE UNIVERSITY OF INFORMATICS SCIENCES... CONNECTED TO THE FUTURE, CONNECTED TO THE REVOLUTION http://www.uci.cu http://www.facebook.com/universidad.uci http://www.flickr.com/photos/universidad_uci
1 note · View note