cwfrazier
Chester Frazier
44 posts
CEO, Frazier Industries, [email protected]
cwfrazier · 4 years ago
Photo
at Ocean Springs, Mississippi https://www.instagram.com/p/CWjmAkpMZxQ/?utm_medium=tumblr
0 notes
cwfrazier · 5 years ago
Text
What products are you really grateful for?
I published What products are you really grateful for? on Medium.
0 notes
cwfrazier · 5 years ago
Text
What products are you really grateful for?
I say this to people all the time, but if you had to go through life with cerebral palsy, there's never been a better time to do it, simply because this era has opened up so many possibilities to people with disabilities that didn't exist before. Take my previous job at California Resources Corporation: somebody who is unable to drive making $31 per hour, full time, without ever leaving his apartment was almost unimaginable even fifty years ago, much less a hundred or two hundred years ago. Hell, I think I only talked to my colleagues over there twice on the phone and never actually met them in person. Everything was conducted either by text or email.
I think the product that has surprised me the most in recent years has been the iPad. I bought myself two or three of them in the early years and never really used them; I either ended up sticking them in a drawer or giving them away to somebody. But a few years ago I somehow transitioned to the point where my iPad is typically my primary device. I'm on one from the time I wake up in the morning to the time I go to bed. Although I still need my Thinkpad, a real computer, for things like programming, I can usually do everything else on an iPad.
I remember in my later years of elementary school, somebody suggested that I start scanning in my math worksheets so I could do them on the computer. So my parents went to CompUSA, bought a scanner for a hundred or two hundred dollars, and we hooked it up. But it turned out to be so impractical. First, we had to scan each sheet one at a time since it was a flatbed, and each scan took a minute or two or three depending on the resolution. Then the real problem we ran into was the lack of software (to be fair, it may have existed and we just didn't know about it) to manipulate the scanned documents. This semester, all we had to do with my math homework was pull out my phone, use Adobe Scan (free) to scan everything in, save it as a PDF, and from there use GoodNotes or any variety of apps to type on the document or mark it up with my Apple Pencil. How far we've come in such a short period of time is crazy.
Another area that has impressed me by maturing so quickly is cloud computing. In high school, we used to host Maricopa's website on an actual "server" on campus. Then, in the early days of my business, I rented an actual server from a company called Server Intellect for about $250 per month, which hosted websites, email servers, etc. Then I discovered Google Apps (now G Suite) for email and Amazon Web Services for everything else. As I recall, I went from $250 per month for a single server to two AWS EC2 instances that were more powerful for something like $80 per month. I still remember having to buy actual servers; now we can just spin them up at will and get charged pennies per hour of uptime.
0 notes
cwfrazier · 5 years ago
Text
My Backup System — Evolved
I published My Backup System — Evolved on Medium.
0 notes
cwfrazier · 6 years ago
Text
Moving Forward
I published Moving Forward on Medium.
0 notes
cwfrazier · 6 years ago
Text
The Way My IT and Development Background Fuels My Approach to Problem Solving
I published The Way My IT and Development Background Fuels My Approach to Problem Solving on Medium.
0 notes
cwfrazier · 8 years ago
Text
Linux: Transferring Your Windows 10 Entitlement License to a Virtual Machine
The very first thing that I did when my new Thinkpad came in a few months ago was, of course, format the hard drive and put Ubuntu on it without hesitation.
However, I am now working on a project that requires me to run Windows for Visual Studio. Since I refuse to run Windows as my host, I went ahead and spun up a Windows 10 virtual machine in VirtualBox, but I still had to activate it. Like most machines bought within the last few years, the Thinkpad came with a Windows 10 license stored in its firmware as an "entitlement".
Transferring that entitlement license to the virtual machine was fairly easy. In your Linux terminal, simply run sudo cat /sys/firmware/acpi/tables/MSDM and the output will contain your Windows 10 license key in plain text at the end, which you can then plug into your virtual machine so it can activate.
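If you'd rather not wade through the binary header of the ACPI table, something like this should pull out just the key. A minimal sketch, assuming the key is the trailing 29-character ASCII string, which is how the MSDM table is usually laid out:

# Print only the product key (assumes it is the last 29 bytes of the MSDM table).
sudo tail -c 29 /sys/firmware/acpi/tables/MSDM && echo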
0 notes
cwfrazier · 8 years ago
Text
Upgrading My Personal Digital Security
With all of the data breaches happening, such as Equifax and the update on the Yahoo breach that happened a while back, it's been on my to-do list to go in and work on my own personal security.
I used to be really good at protecting myself online, but you know how it goes: you get busy and/or lazy, start using the same password everywhere, disable two-factor because it's a pain, and so on.
LastPass
So where did I start? The very first step I took was to dust off my LastPass account and go through all of my online services and replace all of my passwords with randomly generated passwords (up to 100 characters where accepted) from the service.
Then it was time to secure LastPass itself. Since it now contains the passwords for all of my accounts, the last thing I want is someone gaining access to it. For that, I ordered myself a new Yubikey. For those of you who don't know, a Yubikey is a small USB device that looks like a flash drive, but every time you hit the button on it, it emits a 44-character one-time password that the service then checks against the Yubico servers to verify that the code is authentic. As a consequence, in order to get into my LastPass account, you need my username, my password, and physical access to my Yubikey.
Two Factor Authentication
The second thing I did was to turn on two-factor authentication wherever possible.
What I found was that a lot of services only offer two-factor authentication via SMS, which isn't exactly the best way to implement it since it's been shown that text messages can be intercepted and read. Still, it's definitely better than no second factor at all.
Where it was offered, I turned on two-factor authentication via Yubikey. Very few services support it, but it was great to see that Google, Facebook and Dropbox all do.
Amazon Web Services
Most of my servers and databases are hosted with Amazon Web Services. I was fairly surprised that they don't support YubiKeys for multi-factor authentication. So instead, I ordered one of the devices that they recommend, a Gemalto SafeNet Display Card. It's a credit card sized device that generates a six digit PIN when you activate it. Once again, in order to gain access to my AWS account, you now need my username, my password and access to the display card.
Backup Codes
When you turn on two-factor authentication, most services give you a list of "backup codes" that you can use to bypass the second factor just in case your device gets destroyed or lost.
What you're SUPPOSED to do is print the codes out and store them in a safe place. But since I absolutely LOATHE paper, I stored them all in a text file, put it onto a flash drive and made arrangements with my best friend (who lives in an entirely different county) to physically store the flash drive in a fire/water proof safe in her apartment. This way the codes are entirely offline and protected against natural disasters.
Local Security
I went particularly crazy with this section, mostly because I wanted to ensure that if my laptop ever got stolen, it would be completely unusable to the thief.
Ubuntu, which I use, offers two different ways to encrypt your data.
Full Disk Encryption
When you install most distributions of Linux, you're given the option to encrypt the entire hard drive. So when you boot up the machine, before you even get to the username and password prompt, you need to enter a password to decrypt the drive. With the Yubikey, you can program its second "slot" to store a static password of up to 38 characters. So that's what I did, and I use that static password as the full disk decryption key, mostly because I'm lazy and didn't want to enter multiple passwords whenever I turn on or reboot my computer.
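If you already set up full disk encryption at install time, LUKS supports multiple key slots, so you can add the Yubikey's static password as its own passphrase later. A rough sketch; the encrypted partition name /dev/sda3 is an assumption and will vary by machine:

# List the existing LUKS key slots on the encrypted partition (device name is a placeholder).
sudo cryptsetup luksDump /dev/sda3
# Add the Yubikey's static password as an additional passphrase in a free key slot.
sudo cryptsetup luksAddKey /dev/sda3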
Home Directory Encryption
Most Linux distributions offer to encrypt just your home directory where people store the majority of their documents.
This is usually the easiest way to do it just because it uses the password for your local account as the decryption key.
I went ahead and turned that on as well so the files in my home directory are double encrypted, once by the full disk encryption and once by the home directory encryption.
External Hard Drive Encryption
I always have a four terabyte external hard drive connected to my laptop for all my big files.
In Linux, you can format a drive to be fully encrypted with LUKS, requiring you to enter a password when you connect the drive in order for the data to be decrypted. Yep, turned that on.
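For reference, setting up an encrypted external drive from scratch looks roughly like this. A sketch only; the device name /dev/sdb1 and mount point are assumptions, and luksFormat erases whatever is on the drive:

# Format the partition as a LUKS container (destroys existing data, prompts for a passphrase).
sudo cryptsetup luksFormat /dev/sdb1
# Unlock it, which creates /dev/mapper/external.
sudo cryptsetup open /dev/sdb1 external
# Create a filesystem inside the encrypted container and mount it.
sudo mkfs.ext4 /dev/mapper/external
sudo mount /dev/mapper/external /mnt/external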
BIOS
The very first thing I changed in the BIOS was the boot order, from CD/DVD, then USB device, then hard disk, to hard disk first, so someone can't boot from a live DVD or flash drive.
Then I disabled the boot menu option so someone can’t change the boot order without going into the BIOS.
Of course I then put a password in the BIOS so someone can’t change anything on it without the password.
Lastly, not every BIOS has this option, but my Thinkpad does: a tamper detection mechanism. Whenever a hardware change is detected, you must confirm the change in the BIOS before the machine will boot, which of course requires the BIOS password. This means that even if the thief is smart enough to take out the hard drive and put a different one in, it still requires a password, making the laptop completely unusable.
Conclusion
Securing your data is absolutely a pain in the ass. However, we live in a time where it is now a must. Would you rather be slightly inconvenienced now or wait until your identity is compromised?
0 notes
cwfrazier · 8 years ago
Text
Why I Switched from Digital Ocean and Wordpress to Squarespace: A Good Life Lesson
So midway through this summer, I decided that I needed a permanent home on the web, especially somewhere people could go to find out about my professional life and not just read my political views or details about my personal life.
I've run blogs for myself ever since my teenage years, but you guys know how that typically goes: it's cool for a week and then you forget about it. I've always used Wordpress for myself, just because I liked having control over the server. So that's what I did: I went on Digital Ocean, created myself a CentOS droplet, popped Wordpress on it and good to go!
Not really.
As I started to add more and more content, it began crashing more and more frequently. As it turned out, the problem was uniquely tied to Digital Ocean and one of the Wordpress plugins I was using. However, the last time it went down, I had just sent my resume out to several potential employers, with my website listed. I had already brought the site back up once that day, so the second time it went down, I was pretty much done. After all, I'm a tech guy! I didn't want potential employers trying to visit my site and finding it down! I could just imagine, "We're not going to hire him! He can't even run a website!"
So I immediately grabbed my debit card, went over to Squarespace, signed up for an account, paid for the year, and within a few hours had everything transferred and back up and running. To be fair, I've had a lot of experience with Squarespace in the past. It's been my go-to web platform for my IT clients for years, mostly because I could set everything up initially, spend a few hours showing the client around, and then they could make changes to their site themselves without having to call me.
But my point is this: save your energy for the most important work. Hell, I have web servers, application servers, database servers, caching servers, VOIP servers, Active Directory servers, etc. that I've been managing and running 24/7 for years with very little downtime. So sure, I could have gone over to AWS, spun up a few small instances, placed them behind a load balancer, blah, blah, blah, and STILL be the one who manages it and fixes it when it has problems.
Sometimes it's not worth the time or the energy just to be able to say you did it all yourself. I'm not sure if it's because I'm getting older or what, but I'm starting to pick my battles more carefully.
0 notes
cwfrazier · 8 years ago
Text
My Ridiculous Source Code Backup System
One of the most frustrating things in the world as a developer is losing code. So there are few things I take more care with than backing up source code.
Of course I use version control via Subversion on a service called Beanstalk. Yes, I know Git is better; I just grew up on Subversion.
First off, when I check out my repositories on my local machine, the repository directory is nested inside my Dropbox Pro folder. I have Extended History turned on in Dropbox, which means it keeps each and every version of each file for a year versus the standard thirty days.
Now comes the post-commit process. On every single commit, the following occurs (a rough sketch of the local half of this hook follows the list):
Gets checked into subversion (with commit message required)
Automatically deploys on development server
Then on my local machine:
Tars the entire project and places it in a temp directory
Copies the project to my Google Drive
Copies the project to Dropbox
Copies the project to my 4 terabyte external hard drive
Copies the project to an Amazon S3 bucket in California
Copies the project to an Amazon S3 bucket in Virginia
Copies the project to an Amazon S3 bucket in Tokyo that has a lifecycle rule applied to automatically retire all files in that bucket to Amazon Glacier after 24 hours
Removes the tar archive from the temp directory
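For the curious, the local half looks roughly like this. This is a sketch rather than my exact script; the project path, sync folder locations and bucket names are all placeholders:

#!/bin/bash
# Post-commit backup sketch. All paths and bucket names below are assumptions.
PROJECT=~/Dropbox/repos/myproject
ARCHIVE=/tmp/myproject-$(date +%Y%m%d%H%M%S).tar.gz

# Tar the entire project and place it in a temp directory.
tar -czf "$ARCHIVE" -C "$(dirname "$PROJECT")" "$(basename "$PROJECT")"

# Copy to the locally synced Google Drive and Dropbox folders and the external drive.
cp "$ARCHIVE" ~/GoogleDrive/backups/
cp "$ARCHIVE" ~/Dropbox/backups/
cp "$ARCHIVE" /media/external/backups/

# Copy to S3 buckets in three regions. The Tokyo bucket's lifecycle rule that retires
# objects to Glacier after 24 hours is configured on the bucket itself, not here.
aws s3 cp "$ARCHIVE" s3://my-backups-us-west-1/
aws s3 cp "$ARCHIVE" s3://my-backups-us-east-1/
aws s3 cp "$ARCHIVE" s3://my-backups-ap-northeast-1/

# Remove the tar archive from the temp directory.
rm "$ARCHIVE"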
I realize that it's a tiny bit excessive, but I'm not lying when I say I can't stand losing code.
0 notes
cwfrazier · 8 years ago
Text
Why I ditched MySQL and put Bob on DynamoDB
Over the past few years, I have all but given up on using MySQL whenever I need a database, just because I don't like having to be careful about how many queries per second I run or worry about how much load the database server can handle at once, and I have never liked the Oracle licensing arrangements.
When I first started working on Bob years ago, I only meant for it to run off of a single Raspberry Pi 3, which worked well for a while back when all Bob was doing was sending me a text message every eight hours and notifying everyone if I didn't respond. During that time, the Raspberry Pi served as both the web server (Apache) and the database server (MySQL), which was fine at the time. However, as I started adding more and more functionality to Bob, such as location tracking and social media checks, the MySQL service on the Raspberry Pi would crash. Even worse, it would crash silently, so I could go a few days without noticing it was down. Not exactly what you want from a program that is supposed to be monitoring your life 24/7.
I eventually worked around the issue by lightening how much data it stored and how often the scripts queried that data, but it was a half-ass fix.
So last month, when I decided to seriously work on Bob again, the very first decision I made was to ditch MySQL, and overhaul the backend to run exclusively on Amazon’s DynamoDB.
Why DynamoDB?
First of all, I've always been a huge fan of Amazon Web Services. Secondly, it's a completely managed solution: you create the tables and add the data, and Amazon manages the rest.
When you create your tables, you specify how many reads and writes per second each table needs to support, and Amazon automatically spreads your data across however many servers are needed to deliver that throughput (we'll come back to this).
By default, all tables run exclusively on solid state drives, making it incredibly fast.
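Creating a table with provisioned throughput is a single AWS CLI call. This is just a sketch; the table name, key schema and capacity numbers are placeholders:

# Create a table provisioned for 5 reads and 5 writes per second.
aws dynamodb create-table \
    --table-name BobEvents \
    --attribute-definitions AttributeName=device,AttributeType=S AttributeName=timestamp,AttributeType=N \
    --key-schema AttributeName=device,KeyType=HASH AttributeName=timestamp,KeyType=RANGE \
    --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5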
No Licensing Fees
Although it's not open source, there are no licensing fees to use DynamoDB; you only pay for the capacity that you provision, per hour. For instance, if you know that your application will be heavily used during business hours on weekdays, you can provision more throughput during those hours and only get charged for those hours. That brings me to my favorite feature of DynamoDB: auto scaling.
Auto Scaling
As I mentioned before, when you set up your tables, you get to specify how many reads and writes per second you want each table to handle, but the truly beautiful part is that it's completely dynamic, meaning you can adjust it throughout the day.
With old database models, you would typically have to think of your maximum expected capacity and run at that maximum capacity 24/7. With DynamoDB, you can specify a minimum and maximum read and write capacity and it will automatically scale up or scale back down based on usage.
For example, I have all of my tables set with a minimum read and write capacity of 5 per second and a maximum of 10,000, with a rule that if at any time 50% of my provisioned capacity is being used, the capacity doubles, up to the 10,000 cap.
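In practice, that rule is expressed as an Application Auto Scaling target plus a target tracking policy, roughly like the following sketch. The table name is a placeholder, the same commands would be repeated for WriteCapacityUnits, and the 50% figure maps to a target utilization rather than a literal doubling rule:

# Register the table's read capacity as a scalable target (min 5, max 10000).
aws application-autoscaling register-scalable-target \
    --service-namespace dynamodb \
    --resource-id "table/BobEvents" \
    --scalable-dimension "dynamodb:table:ReadCapacityUnits" \
    --min-capacity 5 --max-capacity 10000

# Keep read capacity utilization around 50% by scaling up and down automatically.
aws application-autoscaling put-scaling-policy \
    --service-namespace dynamodb \
    --resource-id "table/BobEvents" \
    --scalable-dimension "dynamodb:table:ReadCapacityUnits" \
    --policy-name bob-events-read-scaling \
    --policy-type TargetTrackingScaling \
    --target-tracking-scaling-policy-configuration '{"TargetValue": 50.0, "PredefinedMetricSpecification": {"PredefinedMetricType": "DynamoDBReadCapacityUtilization"}}'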
What does this mean for Bob?
The more data we can collect, the more accurate the algorithms can be.
Let me give you one example: on my personal account, I have my computers reporting my usage to Bob based on mouse movement. When MySQL was powering the backend, I had to build in a sleep mechanism where, when the computer detected mouse movement, it would report it to Bob and then put itself to sleep for sixty seconds, because otherwise it would try to report to Bob multiple times per second and eventually overwhelm the database server. Now we can collect data down to the millisecond instead of the minute.
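Each of those check-ins is just a single write, along the lines of this sketch (the table and attribute names are made up for illustration and match the earlier example):

# Record one mouse-movement check-in with a millisecond timestamp.
aws dynamodb put-item \
    --table-name BobEvents \
    --item '{"device": {"S": "thinkpad"}, "timestamp": {"N": "1508275200123"}, "event": {"S": "mouse_movement"}}'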
When you think of everything that's either putting data into Bob or taking data out, from computer check-ins to motion sensor data to scripts that run every minute, on the minute, 24/7, you start to see why MySQL was getting so overwhelmed.
So with the database bottleneck almost completely removed, I look forward to throwing as much data as possible into Bob!
0 notes
cwfrazier · 8 years ago
Text
Amazon Glacier: Great for Data Archiving or Last Resort Backups...But Nothing Else
As most people know, I’m a digital hoarder. I never delete anything. I have around 4.6 terabytes of data stored in my Google Drive alone. That’s cool and all but it becomes interesting when I start looking at backup solutions for my data. One of the best solutions out there (in my opinion) for data archiving and data backup is yet another product from Amazon Web Services called Amazon Glacier.
Glacier lets you store your data securely for a whopping $0.004/GB/month; however, there are drawbacks. Since the service is meant for data archiving, your data sits in what's known as "cold storage," meaning it is not accessible on demand. Instead, you (or more likely, your application) tell Glacier that you would like to download a certain file from your "vault" (your archive), and then 3-5 hours later (unless you pay for expedited retrieval), your application receives a notification that the file is ready to download, and it has 24 hours to do so.
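With the AWS CLI, that retrieval dance looks roughly like this (a sketch; the vault name, archive ID and job ID are placeholders):

# Ask Glacier to stage an archive for download (standard retrievals take 3-5 hours).
aws glacier initiate-job --account-id - --vault-name my-vault \
    --job-parameters '{"Type": "archive-retrieval", "ArchiveId": "EXAMPLE_ARCHIVE_ID"}'

# Check on the job, then fetch the staged data within the 24-hour window.
aws glacier describe-job --account-id - --vault-name my-vault --job-id EXAMPLE_JOB_ID
aws glacier get-job-output --account-id - --vault-name my-vault --job-id EXAMPLE_JOB_ID archive.tar.gz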
Another catch is that even though Glacier will let you download the entirety of your vault as fast as your connection allows, it will cost you: getting data back out runs an additional $0.0025-$0.03/GB depending on retrieval speed. That may not sound like a lot, but when we get into terabytes or petabytes of data, it adds up quick.
To sum up, I still think that Amazon Glacier is a great product if used correctly. For instance, if by law your organization is mandated to keep archives for x number of years and you know the chances of actually having to dig them up one day are slim? Glacier is perfect. Or as a last resort backup, meaning you have two or three other backups you can try to extract your data from before you have to dig into Glacier? Then yeah.
0 notes
cwfrazier · 8 years ago
Text
Safari has Gotten Password Management Nailed
Some of you may know that I'm doing an experiment where I'm trying to use my iPad Pro as my primary computer for a year. One of the things that has pleasantly surprised me is Safari's built-in password manager. Every time I go to create a new account somewhere on my iPad, Safari shows a nice button above the keyboard that says "Suggest Password" and generates a long, random password for me.
Now, I'm a big fan of LastPass and 1Password, but they fail my "dad test": is this easy enough that I could get my dad to use it? Unfortunately, I still don't think the password managers have gotten mobile right. Besides, I don't know anybody outside of tech who is willing to pay for an additional subscription (no matter how cheap) for something that makes their life harder.
But I really think that Apple did it right. They made it simple, free and secure. I sincerely hope that other browsers can follow!
0 notes
cwfrazier · 8 years ago
Text
Freshbooks: Stop Creating Invoices in Microsoft Word
I've always been surprised by how many people who have a business on the side, whether consulting, graphic design or the like, still use Microsoft Word to create and send their invoices. Stop it! There's a better way! It's called FreshBooks. I've used it for most of my business career and it's quite honestly the best tool for the job. With FreshBooks you can:
Create and send professional looking invoices
Have Freshbooks automatically generate and send recurring invoices
Track expenses and upload receipts
See when a client has or has not opened your invoice
Allow your clients to pay their invoices by credit card
Generate reports for profit and loss statements, expense reports, revenue by client, etc
I can't say enough good things about FreshBooks; it is one of the best business tools that I've ever used! It starts at $15/month and you can try it free for 30 days. If you have to send out invoices every month, I highly recommend checking it out!
0 notes
cwfrazier · 8 years ago
Text
PHP Tip: Auto-Prepending your Custom Functions Globally
There are just some things that you like done your way. Ever since I've been developing with PHP, I've kept a file of my own custom functions that I know I may want to use again: everything from a custom send-email function that reports bounces back to me, to simple one-parameter file uploads, to encryption functions and so on. I got into the habit about ten years ago of maintaining this master functions file, and on every web server that I have under my control, I point the auto_prepend_file flag in php.ini at it, so that all of my custom functions are available to me in every PHP file on that system without me having to do anything.
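If you want to try it, it's a one-line change. A rough sketch; the php.ini path and the functions file location here are assumptions that depend on your distribution and PHP version:

# Find which php.ini the command line (or web server) is actually loading.
php -i | grep "Loaded Configuration File"
# Point auto_prepend_file at the master functions file, then reload the web server.
echo "auto_prepend_file = /var/www/shared/functions.php" | sudo tee -a /etc/php/7.0/apache2/php.ini
sudo service apache2 reload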
0 notes
cwfrazier · 8 years ago
Text
Virtualization Platforms
I've always been fascinated with virtualization. I remember back in my high school days, I had to dual boot my desktop with Windows and Linux because I wanted to use both operating systems. Then I got my hands on a copy of VMware Workstation and it was huge! I could have virtual desktops for every operating system that I wanted!
Fast forward to 2017: most of the servers that I manage are virtualized. Early on, when I was just getting started with server virtualization, my go-to platform was VMware ESXi, and it still is a great product. However, nowadays if I need a bare-metal virtualization server, I typically go with Citrix XenServer, just because it feels more polished.
Surprisingly, I'm typically not a fan of the Windows Server family unless I really have to run something that requires Windows Server (Active Directory, Exchange, IIS, etc.), but their Hyper-V hypervisor is really great! I use it whenever I can't dedicate an entire machine to a bare-metal hypervisor.
Then there's personal use. It should come as no surprise that if I need a virtual machine for testing purposes on my personal equipment, I just use Oracle's VirtualBox. Besides being free and open source, it's cross-platform, so it runs on Windows, Mac and Linux.
So there you have it. I don’t really have a favorite virtualization platform. As with most things, I just choose the right tool for the job.
0 notes