How to install PHP 7 on Ubuntu 14.04
In this short tutorial I’m going to explain, step by step, how to install PHP 7 (the latest stable version) on Ubuntu 14.04 (the latest LTS, or Long Term Support, version). I’ll also explain how to upgrade from PHP 5.x to PHP 7 on a LEMP stack.
Installing PHP 7 on vanilla Ubuntu 14.04
Unfortunately PHP 7 is not available out of the box on 14.04, so we’ll have to add a PPA to be able to install it:
sudo apt-add-repository ppa:ondrej/php -y
sudo apt-get update
sudo apt-get install -y php7.0-common php7.0-cli php7.0-fpm php7.0-curl php7.0-sqlite3 php7.0-json php7.0-tidy php7.0-mysql
And you can confirm that we have PHP 7 running with:
php -v
That’s it, we’re done!
Updating LEMP stack to PHP 7
Now for a slightly more complex setup. We already have a LEMP stack (Linux, NGINX, MySQL, PHP) with PHP 5.x running on Ubuntu 14.04 and we want to update it to PHP 7. I’m going to use DigitalOcean’s “LEMP on 14.04” image.
The first part is the same as above: we’re going to add the PPA and install a bunch of basic packages:
sudo apt-add-repository ppa:ondrej/php -y
sudo apt-get update
sudo apt-get install -y php7.0-common php7.0-cli php7.0-fpm php7.0-curl php7.0-sqlite3 php7.0-json php7.0-tidy php7.0-mysql
Next, let’s update the NGINX settings to use the new version of PHP. Open /etc/nginx/sites-available/default and replace
fastcgi_pass unix:/var/run/php5-fpm.sock;
with
fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
and then restart NGINX
sudo service nginx restart
If you’ve made any changes to /etc/php5/fpm/php.ini or /etc/php5/cli/php.ini, you should reapply them in /etc/php/7.0/fpm/php.ini and /etc/php/7.0/cli/php.ini, respectively.
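If you’re not sure what you changed, diffing the old and new files will point you in the right direction (note that the output will also include upstream differences between the PHP 5.x and 7.0 defaults, so treat it only as a starting point):

diff /etc/php5/fpm/php.ini /etc/php/7.0/fpm/php.ini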
At this point your website should be served by PHP 7 via NGINX.
Investing for 20-somethings: What's in it for me?
Imagine, just for a moment, that you have a very rich and very generous Auntie. She decided to cover your fixed costs to allow you to focus on your career and passions, instead of worrying about job loss and how to pay bills on time. The two of you agreed that the Auntie would pay for your rent (£500/mo), your groceries (£250/mo) and your oh-so-essential connectivity needs (“kids these days!”) like the internet, smartphone and Netflix pass (£50/mo), for a total of £800/mo. You don’t have to worry about paying the money back and you don’t have to worry about the reliability of the money - it arrives in your bank account on the 1st of each month, like clockwork.
How would you feel about your financial security?
Even if you lost your job or simply decided to sit on your butt all day every day, you would not starve to death or sleep on the street. Keeping your job would mean you can enjoy and excel at it a lot more than before - you work there because you want to, not because you have to, after all. If you’re in a job that you don’t enjoy, you’d be free to change it any time you want, but if you do want to stay there - all the proceeds could be spent on your passions, your hobbies - anything you like!
Sadly, most of us don’t have a generous Auntie to rely on...
...but don’t fret, there’s a way to simulate such a generous family member: investments! Smart investing in the world’s markets can provide enough income to pay for your fixed costs and more. According to the 4% rule (which we’re going to explore in detail in a future post), you’d need only £240k to generate £800/mo (£240,000 × 4% = £9,600 a year, or £800 a month) - and not for 1, 5 or 20 years, but indefinitely!
But £240k is a lot of money!
Is it, really? You’re a capable young adult in your mid-20s. Even if you saved only £1000 per month in a savings account earning just enough to keep up with inflation, you would have saved enough by the age of 45, a full 20 years before the “official” retirement age! And a “just enough to keep up with inflation” rate of return is definitely not our goal here.
I want to see some results!
I’d like to show you what would happen to your money if you put aside a set amount (£1000) every month and kept it in different places: a savings account (0.1%, 1%), a Cash ISA (2%), or a Stocks & Shares ISA (5%, 7%).
Compared to just stuffing the money under the mattress (£240k), the money in the savings account barely grew (0.1% annual return: £255k; 1% annual return: £279k), the Cash ISA had moderate success (2% annual return: £309k), but the Stocks & Shares ISA looks pretty good (5% annual return: £429k; 7% annual return: £538k) - it pretty much doubled!
Do you want to see what would happen if you kept saving £1000 monthly until 65?
Welcome to the millionaires' club! As you can see, the rate of return makes a huge difference to the end result. Now, let me show you how much your money would grow in different scenarios if you invested a lump sum of £240k today and never added a penny again.
On one end of the scale (0.1%) your money would only reach £250k; on the other end (7%) you'd have over £3.5M to your name. Not bad, given you just let that money grow on its own.
But I want to spend my money now, I deserve it
Do you remember the Auntie? If you spend all your money now, you won’t get to experience the bliss of not worrying about meeting your basic needs ever again.
I encourage you to spend some of your money on little indulgences and luxuries today, but do not spend every last penny! You do realise that with all the advancements in medicine happening today, we’re probably going to live for 100 years or more, don’t you? Give your future self the gift of financial security for many decades to come and put aside a set amount of money each month, starting now.
Let me tell you a story from Burton Malkiel, about…
Two brothers investing at different times
William and James are twins who are 65 years old. 45 years ago (at the end of the year that they turned 20), William started an IRA and put $2k in the account at the end of each year. After 20 years of contributions, William stopped making new deposits and left the accumulated contributions to compound. The fund returned 10% per year tax-free.
James started his own IRA when he reached the age of 40 (just after William quit) and added $2k per year for 25 years. James invested 25% more money in total than William. James also earned 10% tax-free. What are the values of William's and James's IRA funds today?
The answer might surprise you, but William, who invested less money, ended up with over $1.3M, while his brother James, who invested more, ended up with only $218k. This is the power of compounding in action. William's money simply had a lot more time to grow.
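If you want to check the arithmetic yourself, here is one set of assumptions that roughly reproduces these numbers (10% annual compounding, each $2k deposit compounding from the year it lands; the exact figures shift slightly with the timing assumptions): William: $2,000 × ((1.1^20 − 1) / 0.1) × 1.1 × 1.1^25 ≈ $1.37M. James: $2,000 × ((1.1^25 − 1) / 0.1) × 1.1 ≈ $216k.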
Moral of the story?
"Better late than never" is a lie. Start investing now and reap rewards for the rest of your life.
This is only the first post in the investment series I'm writing. If you want to be notified when the future posts go live, leave your email address below.
Creating a post-installation script for Ubuntu
I want to share with you today how to create your own bash script that can be used to bootstrap a fresh installation of Ubuntu (and very likely any other Linux distro after small modifications), bringing it very close to a state where you can just open your favourite apps and start working.
Back in the days of Windows XP I'd create a perfect setup of my machine and use software like Norton Ghost to create an image of the main partition. It had several advantages over my current approach:
no access to the internet required
fully automated process
every tiny detail saved
Here's the thing, though: with Windows I'd have to reinstall every 2-4 weeks due to my heavy usage, constant installing and uninstalling of apps and the general sluggishness of the system after a while. With Linux, on the other hand, I can go for months on the same install and I rarely run into problems that force me to reinstall the system. I can, however, put it on a different disk or partition, on a completely different machine, or even use it for a different user - tweaking the username and adding/deleting sections of the script to match the new environment. Another great thing is that the script and all the resources it uses are very small compared to a disk image. There is also no need to update any apps (or the system itself) after running the script, as it already installs the newest versions available and runs a system update.
The script is separated roughly into 3 parts:
install apps
configure apps
change system settings
with gsettings
with dconf
I've developed this script using Ubuntu 13.10 in VirtualBox 4.3 - you can create a snapshot of the system right after the basic installation (plus the initial update & upgrade commands) and revert to it every time you want to run your code.
Part 1: Install apps
Some apps will require additional repositories, which should be added at the very top of your script. After that you can run update & upgrade, which will bring your fresh install up to speed with the latest versions of everything installed by default.
The way I went about the apps was to go through my ~/.bash_history file and make a list of all the apps that I'd like to have from the get-go. I added them all to one massive apt-get install.
There are two more apps that I want to install that can't be installed via apt-get, as they are essentially PHAR files: Composer and Laravel. These are kept in /usr/local/bin/. Don't forget to change their permissions and owner.
At the very end I have ubuntu-restricted-extras, which requires interaction.
Part 2: Configure apps
There are generally several things to do, depending on the app and your personal preferences:
replace existing conf files / dotfiles with the ones inside data folder
append settings to existing conf files
add user to groups
copy scripts / program files into appropriate folders
Note: Pay attention to files that require root access to edit them. You won't be able to do the following:
sudo echo "alpha" > /etc/some/important/file sudo echo "bravo" >> /etc/some/other/important/file
The "sudo" applies only to "echo" in the examples above. Here's how you replace and append contents of these files:
echo "alpha" | sudo tee /etc/some/important/file echo "bravo" | sudo tee -a /etc/some/important/file
Note: Here's how to copy dotfiles with a wildcard (*):
shopt -s dotglob
cp -ar ./data/dotfiles/* ~
Without the first line, the * wouldn't match files starting with "." .
Part 3: Change system settings
In this part we'll focus on two different tools: gsettings and dconf. I was planning to use only gsettings, but it turns out that some things just can't be changed with it.
Part 3.1: gsettings
My favourite way to make use of gsettings is to dump all the current settings from a fresh install and diff them against my working machine.
On a fresh install within Virtualbox:
gsettings list-recursively > ~/original.txt
On my working machine:
gsettings list-recursively > ~/new.txt
It's a good idea to sort settings and get rid of duplicates before diffing the two files. Sublime Text can do that for you as well as diff the files. This way you will be able to see which settings actually changed since the fresh installation.
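If you'd rather do the sorting and deduplication from the command line, something like this works too (sort -u sorts and removes duplicates in one go):

sort -u ~/original.txt > ~/original-sorted.txt
sort -u ~/new.txt > ~/new-sorted.txt
diff ~/original-sorted.txt ~/new-sorted.txt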
Copy all the settings that you want to preserve and prepend each line with "gsettings set". Don't forget to add double quotes around arrays like ['spotify.desktop'].
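For example, a line from my script below:

gsettings set com.canonical.indicator.sound interested-media-players "['spotify.desktop']"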
Part 3.2: dconf
I've used this tool to capture a few more settings that I couldn't change with gsettings. This time, the easiest way to do it is to run dconf on a fresh install in monitor mode with:
dconf watch /
and make the desired changes manually. You should see paths popping up on the screen with their new values. Prepend "dconf write" to the lines and values you want to set on a fresh machine and add them to your script.
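For example, changing the Unity launcher icon size showed up as a path and a value, which became this line in my script below:

dconf write /org/compiz/profiles/unity/plugins/unityshell/icon-size 32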
One final note
sudo requires you to retype your user password after 10 minutes, so I try to put all the sudo commands before the rest of the commands, as much as possible.
Here's my current script:
#!/bin/bash

# add repos
sudo apt-add-repository -y "deb http://repository.spotify.com stable non-free"
sudo add-apt-repository -y "deb http://linux.dropbox.com/ubuntu $(lsb_release -sc) main"
sudo add-apt-repository -y "deb http://archive.canonical.com/ $(lsb_release -sc) partner"
sudo add-apt-repository -y "deb http://dl.google.com/linux/chrome/deb/ stable main"
sudo add-apt-repository -y "deb http://dl.google.com/linux/talkplugin/deb/ stable main"
sudo add-apt-repository -y ppa:webupd8team/sublime-text-3
sudo add-apt-repository -y ppa:tuxpoldo/btsync
sudo add-apt-repository -y ppa:freyja-dev/unity-tweak-tool-daily
sudo add-apt-repository -y ppa:stefansundin/truecrypt
sudo apt-key adv --keyserver pgp.mit.edu --recv-keys 5044912E
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 94558F59
wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | sudo apt-key add -

# basic update
sudo apt-get -y --force-yes update
sudo apt-get -y --force-yes upgrade

# install apps
sudo apt-get -y install \
    libxss1 spotify-client sublime-text-installer git gitk gitg \
    virtualbox virtualbox-guest-additions-iso filezilla dropbox \
    skype btsync-user gimp p7zip p7zip-full p7zip-rar unity-tweak-tool \
    indicator-multiload curl gparted dkms google-chrome-stable \
    ubuntu-wallpapers* php5-cli php5-common php5-mcrypt php5-sqlite \
    php5-curl php5-json phpunit mcrypt ssmtp mailutils mpack truecrypt \
    nautilus-open-terminal google-talkplugin linux-headers-generic \
    build-essential tp-smapi-dkms thinkfan moc

# install Composer
curl -sS https://getcomposer.org/installer | php
sudo mv composer.phar /usr/local/bin/composer
sudo chmod 755 /usr/local/bin/composer

# install Laravel
wget http://laravel.com/laravel.phar
sudo mv laravel.phar /usr/local/bin/laravel
sudo chmod 755 /usr/local/bin/laravel

# Virtualbox
sudo adduser x vboxusers

# email
sudo cp ./data/etc/ssmtp.conf /etc/ssmtp/ssmtp.conf
sudo chmod 744 /etc/ssmtp/ssmtp.conf

# x200 fan settings
# http://hackmemory.wordpress.com/2012/07/19/lenovo-x200-tuning/
echo "tp_smapi" | sudo tee -a /etc/modules
echo "thinkpad_acpi" | sudo tee -a /etc/modules
echo "options thinkpad_acpi fan_control=1" | sudo tee /etc/modprobe.d/thinkpad_acpi.conf
sudo cp ./data/etc/default/thinkfan /etc/default/thinkfan
sudo cp ./data/etc/thinkfan.conf /etc/thinkfan.conf
sudo chmod 744 /etc/default/thinkfan
sudo chmod 744 /etc/thinkfan.conf

# usb wifi + disable built-in wifi // https://github.com/pvaret/rtl8192cu-fixes
mkdir -p /tmp/bootstrap/usb-wifi-fix/
unzip -d /tmp/bootstrap/usb-wifi-fix/ ./data/usb-wifi-fix.zip
sudo dkms add /tmp/bootstrap/usb-wifi-fix/
sudo dkms install 8192cu/1.8
sudo depmod -a
sudo cp /tmp/bootstrap/usb-wifi-fix/blacklist-native-rtl8192.conf /etc/modprobe.d/

# swappiness (plain ">>" wouldn't survive sudo - see the note above)
cat ./data/etc/sysctl-append | sudo tee -a /etc/sysctl.conf

# Sublime Text 3
mkdir ~/.config/sublime-text-3/
unzip -d ~/.config/sublime-text-3/ ./data/sublime-text-3.zip
cp -ar ./data/sublime-text-3/* ~/.config/sublime-text-3/

# fonts
mkdir ~/.fonts
cp -ar ./data/fonts/* ~/.fonts/

# scripts
mkdir ~/.scripts
cp -ar ./data/scripts/* ~/.scripts/
chmod +x ~/.scripts/*

# dotfiles
shopt -s dotglob
cp -a ./data/dotfiles/* ~

# autostart
cp -a ./data/autostart/* ~/.config/autostart/

# Filezilla servers
mkdir ~/.filezilla/
cp -a ./data/filezilla/sitemanager.xml ~/.filezilla/

# Terminal
cp -a ./data/gconf/%gconf.xml ~/.gconf/apps/gnome-terminal/profiles/Default/

# folders
rm -rf ~/Documents
rm -rf ~/Public
rm -rf ~/Templates
rm -rf ~/Videos
rm -rf ~/Music
rm ~/examples.desktop
mkdir ~/Development
mkdir ~/BTSync

# update system settings
gsettings set com.canonical.indicator.power show-percentage true
gsettings set com.canonical.indicator.sound interested-media-players "['spotify.desktop']"
gsettings set com.canonical.indicator.sound preferred-media-players "['spotify.desktop']"
gsettings set com.canonical.Unity form-factor 'Netbook'
gsettings set com.canonical.Unity.Launcher favorites "['application://google-chrome.desktop', 'application://sublime-text.desktop', 'application://spotify.desktop', 'application://nautilus.desktop', 'application://gnome-control-center.desktop', 'application://gitg.desktop', 'application://gnome-terminal.desktop', 'unity://running-apps', 'unity://expo-icon', 'unity://devices']"
gsettings set com.canonical.Unity.Lenses remote-content-search 'none'
gsettings set com.canonical.Unity.Runner history "['/home/x/.scripts/screen_colour_correction.sh']"
gsettings set com.ubuntu.update-notifier regular-auto-launch-interval 0
gsettings set de.mh21.indicator.multiload.general autostart true
gsettings set de.mh21.indicator.multiload.general speed 500
gsettings set de.mh21.indicator.multiload.general width 75
gsettings set de.mh21.indicator.multiload.graphs.cpu enabled true
gsettings set de.mh21.indicator.multiload.graphs.disk enabled true
gsettings set de.mh21.indicator.multiload.graphs.load enabled true
gsettings set de.mh21.indicator.multiload.graphs.mem enabled true
gsettings set de.mh21.indicator.multiload.graphs.net enabled true
gsettings set de.mh21.indicator.multiload.graphs.swap enabled false
gsettings set org.freedesktop.ibus.general engines-order "['xkb:us::eng']"
gsettings set org.freedesktop.ibus.general preload-engines "['xkb:us::eng']"
gsettings set org.gnome.DejaDup backend 'file'
gsettings set org.gnome.DejaDup delete-after 365
gsettings set org.gnome.DejaDup include-list "['/home/x/Development', '/home/x/Pictures']"
gsettings set org.gnome.DejaDup periodic-period 1
gsettings set org.gnome.DejaDup welcomed true
gsettings set org.gnome.desktop.a11y.magnifier mag-factor 13.0
gsettings set org.gnome.desktop.background picture-uri 'file:///usr/share/backgrounds/163_by_e4v.jpg'
gsettings set org.gnome.desktop.default-applications.terminal exec 'gnome-terminal'
gsettings set org.gnome.desktop.input-sources sources "[('xkb', 'us')]"
gsettings set org.gnome.desktop.input-sources xkb-options "['lv3:ralt_switch', 'compose:rctrl']"
gsettings set org.gnome.desktop.media-handling autorun-never true
gsettings set org.gnome.desktop.privacy remember-recent-files false
gsettings set org.gnome.desktop.screensaver lock-enabled false
gsettings set org.gnome.desktop.screensaver ubuntu-lock-on-suspend false
gsettings set org.gnome.gitg.preferences.commit.message right-margin-at 72
gsettings set org.gnome.gitg.preferences.commit.message show-right-margin true
gsettings set org.gnome.gitg.preferences.diff external false
gsettings set org.gnome.gitg.preferences.hidden sign-tag true
gsettings set org.gnome.gitg.preferences.view.files blame-mode true
gsettings set org.gnome.gitg.preferences.view.history collapse-inactive-lanes 2
gsettings set org.gnome.gitg.preferences.view.history collapse-inactive-lanes-active true
gsettings set org.gnome.gitg.preferences.view.history search-filter false
gsettings set org.gnome.gitg.preferences.view.history show-virtual-staged true
gsettings set org.gnome.gitg.preferences.view.history show-virtual-stash true
gsettings set org.gnome.gitg.preferences.view.history show-virtual-unstaged true
gsettings set org.gnome.gitg.preferences.view.history topo-order false
gsettings set org.gnome.gitg.preferences.view.main layout-vertical 'vertical'
gsettings set org.gnome.nautilus.list-view default-zoom-level 'smaller'
gsettings set org.gnome.nautilus.preferences executable-text-activation 'ask'
gsettings set org.gnome.settings-daemon.plugins.media-keys terminal 'XF86Launch1'
gsettings set org.gnome.settings-daemon.plugins.power critical-battery-action 'shutdown'
gsettings set org.gnome.settings-daemon.plugins.power idle-dim false
gsettings set org.gnome.settings-daemon.plugins.power lid-close-ac-action 'nothing'
gsettings set org.gnome.settings-daemon.plugins.power lid-close-battery-action 'nothing'

# update some more system settings
dconf write /org/compiz/profiles/unity/plugins/unityshell/icon-size 32
dconf write /org/compiz/profiles/unity/plugins/core/vsize 1
dconf write /org/compiz/profiles/unity/plugins/core/hsize 5
dconf write /org/compiz/profiles/unity/plugins/opengl/texture-filter 2
dconf write /org/compiz/profiles/unity/plugins/unityshell/alt-tab-bias-viewport false

# requires clicks
sudo apt-get install -y ubuntu-restricted-extras

# prompt for a reboot
clear
echo ""
echo "===================="
echo " TIME FOR A REBOOT! "
echo "===================="
echo ""
New Year's resolution: Inbox Zen
It's good to clean up your environment every now and then. Pretty much everyone uses email today and it seems like many of us don't really take care of this bit of our cyber space. Every morning I'd wake up to 10-30 emails that I'd select and archive in bulk, without even skimming through their contents, all scattered across different labels, occasionally separated by a few emails that I did actually want to read. Twitter notifications, offers from Amazon, random newsletters that I always plan to read "later", newsletters that I never explicitly signed up for. Many more would get filtered out of my inbox before I could even see them, archived automatically, sent to spam, deleted. Coming back from a holiday would mean digging through 100s of emails. Not good.
It's worth mentioning that I've disabled both Gmail tabs (Social, Promotions, etc.) and Priority Inbox as soon as they became available. I believe it's better to clean the mess instead of sweeping it under the rug.
Today I've spent a couple of hours tweaking and uncluttering my Gmail account. I never really had a problem getting down to inbox zero (or near zero), but over the years the amount of dirt has built up and the inbox needed a thorough cleaning. Here are the steps I took:
1. Unsubscribe from newsletters
I've unsubscribed from 20+ newsletters, most of which I'd archive as soon as I opened the email, without even reading the contents. The best way to go about it is to search for "unsubscribe" and go through the results one by one.
2. Unsubscribe from transactional emails
This one is still in progress, but I went through a number of emails and changed settings whenever I felt I didn't really need the notification. Great examples are Twitter, Facebook and Google+ emails (I get notifications on my phone anyway). Another example would be forums that let you receive notifications in real time or in a daily/weekly digest. I opted for daily emails for threads that I'm particularly interested in right now, disabling notifications for the rest of them.
3. Reduce the number of custom labels
Many of the custom labels hadn't been used in years and contained 1-5 emails that I didn't need anymore. I got rid of most of them, leaving just 3: travel-related emails, phpconsole, and communication with people close to me. No emails were deleted in this step, so all I got rid of was a bit of unneeded structure - down from ~30 labels to 3. Not bad!
4. Hide most of the labels
The only labels that I have visible by default are "Inbox" and "Starred". "Drafts" are visible only if there's anything to show (empty 99% of the time). All other labels, Categories and Circles are always hidden and I can get to them by clicking "More" below the two visible default labels.
5. Reduce the number of filters
Many of the filters became redundant after performing steps 1-3, so I got rid of them. Most of the filters that were left did exactly the same thing: make sure that emails coming from [email protected] never get sent to spam. That's really useful for automated emails from servers or my Raspberry Pi, which often contain exactly the same copy and might be treated by Big G's robots as spam. You wouldn't want to miss emails saying that the server is down, would you? I might combine them all into one big filter in the future, but for now it all looks pretty good. Down from ~70 to 16.
6. Delete transactional emails
This step, along with step 7, allowed me to shrink my email archive by ~800MB. I searched for transactional emails, mainly from Twitter and Facebook along with a few more websites, and got rid of all of them. There is absolutely no value in keeping these emails.
7. Delete emails with large attachments
I started with the search "larger:25m", which shows emails with attachments larger than 25MB, and kept lowering the threshold by 5MB, deleting emails that were no longer useful - in many cases photos for old projects that just took up space and that I'd never need again.
8. Ongoing maintenance
I'm on the lookout for the unwanted emails that are still coming to my inbox and am getting rid of them for good, one by one, instead of just archiving them.
That's it, the job is (nearly) done. I'm looking forward to a bigger percentage of human-created emails in my inbox that were meant to reach me specifically.
Have tips on staying sane while working with email? Hit me up in the comments below.
Raspberry Pi [Part 3]: Amazon Glacier
Ok, it's time for a real task for our Raspberry Pi. Today we'll learn how to configure a command line client for Amazon Glacier and push GBs of data to the cloud. We'll also configure our Gmail account so the RPi can send us an email when it's done uploading. Last, but not least, we will learn how to limit the RPi's upload rate so that other devices can still use the internet.
I've recently used my RPi for this very task, pushing 120GB of my photos and backups up to Glacier. It took quite a while on my not-so-good internet connection - I left it running for a couple of weeks. The great thing about the RPi is that it's pretty much inaudible, even with an HDD spinning 24/7, which makes it a perfect little server that can run under your desk. Let's get right to it!
1. Getting up to date
Let's log in and update the system
ssh [email protected]
sudo apt-get update && sudo apt-get upgrade
2. Install glacier-cmd
We will start by installing git and the required Python libraries, and then install glacier-cmd
sudo apt-get install python-setuptools git
git clone git://github.com/uskudnik/amazon-glacier-cmd-interface.git
cd amazon-glacier-cmd-interface
sudo python setup.py install
3. Configure it
Let's create a config file for glacier-cmd and fill it in
nano .glacier-cmd
Add the following, replacing your_access_key, your_secret_key and your_aws_region with correct values
[aws]
access_key=your_access_key
secret_key=your_secret_key

[glacier]
region=your_aws_region
logfile=~/.glacier-cmd.log
loglevel=INFO
output=print
Now you should be able to see your vaults by executing
glacier-cmd lsvaults
Success!
4. New vault
Let's create a new vault for our photos
glacier-cmd mkvault "photos"
You should be able to see a new "photos" vault on the list
glacier-cmd lsvaults
5. Uploading a test file
Ok, now we can try to upload a file. I'm going to upload "ocr_pi.png", which I can see in my home directory
glacier-cmd upload --description "ocr_pi.png" photos "ocr_pi.png"
As you can see, I set the description to match the filename. By default, it would be set to the full path of the file, which is something we don't want - hence the description parameter.
6. Uploading multiple files
Now we're going to create a script that will take care of uploading a bunch of files. Navigate to the folder that holds the files you want to upload. In this example I'm going to upload zip archives - I trust you can figure out how to prepare your own files for the upload (a simple example below). I've tried to keep every zip file below 500MB, making it easier to upload, and also to download the data in the future in case I need to access only part of it.
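For example, packing one folder per archive could look like this (hypothetical folder name; -r makes zip recurse into subfolders):

zip -r photos-2012.zip ./photos-2012/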
Let's create a folder where we'll move uploaded files
mkdir uploaded
and a new script inside the folder with
nano upload.sh
and paste the following
find . -name "*.zip" | sort | while read file ; do echo "Uploading $(basename "$file") to Amazon Glacier." glacier-cmd upload --description "$(basename "$file")" photos "$file" && mv "$file" "uploaded" done
Now we can execute the script with
bash upload.sh
It should upload all files one by one, showing progress and rate as it does its thing.
7. Installing screen
All good and well, but we still can't leave it running on its own, uploading away all the files that we've prepared. We could, in theory, use a cron job for that, but I personally like to be able to see the progress in real time whenever I want.
We're going to install screen, a little utility that lets us disconnect from an ssh session while it's still running and connect back to it at a later time, as if we never left.
sudo apt-get install screen
Now, let's start a session called simply "pi" within screen with
screen -S pi
You might notice that not much has changed - in fact, everything looks exactly the same. But let's see what screen lets us do. We will start top and disconnect, then we will try to reconnect and see if top is still running
top
Now press ctrl+a followed by d. After that, you should see information similar to
[detached from 3068.pi]
We can now exit our ssh session with
exit
If everything went well, top is still running on our RPi even though we're disconnected. Let's see
ssh [email protected]
screen -r pi
Boom, top is still running! As you can see, from the system's perspective we never logged out. This is going to be really useful for glacier-cmd. Just log into your "pi" screen session and execute our bash script as before
bash upload.sh
Now you can disconnect with ctrl+a followed by d and reconnect later to see how the script is doing. Neat, eh?
8. Email notification
I'd also like to be notified when the RPi is done uploading my files. It might take days (or weeks), depending on how much data you want to upload and how fast your internet connection is. The unfortunate truth is that upload speeds are almost always much worse than download speeds.
Let's configure the mail command, so the RPi can email us about the finished upload using a Gmail account.
sudo apt-get install ssmtp mailutils mpack
sudo nano /etc/ssmtp/ssmtp.conf
And set (or add, if they are not there) these options
mailhub=smtp.gmail.com:587
hostname=raspberrypi
AuthUser=[email protected]
AuthPass=myraspberrypipassword
useSTARTTLS=YES
Now we should be able to send a test message
echo "email body" | mail -s "email subject" [email protected]
If everything worked fine, we can add email notification to our upload script
nano upload.sh
The whole script should look like this
find . -name "*.txt" | sort | while read file ; do echo "Uploading $(basename "$file") to Amazon Glacier." glacier-cmd upload --description "$(basename "$file")" photos "$file" && mv "$file" "uploaded" done echo "Selected files were uploaded successfully." | mail -s "Glacier uploads finished" [email protected]
Now, our script is going to email us when it's finished.
9. Throttling upload
The last part of this tutorial is throttling the upload speed, so that the RPi doesn't choke your internet connection
sudo apt-get install wondershaper
sudo wondershaper wlan0 100000 400
The limits that we're setting are in kilobits per second, so the 400 above equals 50kB/s. The first parameter is our network device - wlan0 for wifi, eth0 for a wired connection. The second parameter is the download speed, which you probably don't want to limit. The third parameter is the upload speed.
Run the following to clear limits
sudo wondershaper clear wlan0
You might want to add these commands to cron, so that the RPi can use as much upload speed as possible at night and cut back during the day. Here are the lines (in /etc/crontab format, which includes a user field) that limit the speed at 10am and remove the limitation at 1am, every day
0 10 * * * root sudo wondershaper wlan0 100000 400
0 1 * * * root sudo wondershaper clear wlan0
That's it, our RPi is ready to push tons of data to the cloud.
Other parts in this series
Part 1: Basic setup without any cables
Part 2: External drives, Samba and SFTP
Part 3: Amazon Glacier
Raspberry Pi [Part 2]: External drives, Samba and SFTP
In the second part of this series I want to show you how to connect external drives and configure SFTP and Samba for Raspberry Pi.
Note: You will need a powered USB hub if you plan to connect a 2.5" external HDD (it requires more power than the RPi can provide).
1. Update
Let's start with updating our RPi
sudo apt-get update && sudo apt-get upgrade
2. Setting up HDD
Now, the safest way to plug in our external HDD is to shut the RPi down with
sudo poweroff
unplug the power source, connect the HDD and plug the power back in. After it boots and you're able to ssh in, execute
sudo blkid
to see the list of disks. Mine looks like this:
/dev/mmcblk0p1: SEC_TYPE="msdos" LABEL="boot" UUID="2654-BFC0" TYPE="vfat"
/dev/mmcblk0p2: UUID="548da502-ebde-45c0-9ab2-de5e2431ee0b" TYPE="ext4"
/dev/sda1: LABEL="Data" UUID=3862A6DC65464A36 TYPE="ntfs"
The first two lines are Raspbian's partitions on the SD card. The third line describes my HDD. As you can see, its UUID is "3862A6DC65464A36" and it's an NTFS drive. We will need this information shortly.
Now we're going to create a folder where all our drives (if you plan to use more than one) are going to be accessible and a folder that will represent our HDD
sudo mkdir /media/shares
sudo mkdir /media/shares/data
The next step is to open fstab file
sudo nano /etc/fstab
and add the following configuration
UUID=3862A6DC65464A36 /media/shares/data auto uid=pi,gid=pi,noatime 0 0
As you can see, the UUID matches my HDD. This is important in case you plug in more drives and the device path (e.g. /dev/sda1) changes.
Execute the following to mount the drive
sudo mount -a
Now you should be able to navigate to the drive using
cd /media/shares/data
And display its contents with
ls -lah
3. Handling NTFS
It's quite possible that your external HDD is formatted with NTFS. Your RPi will be able to see the folders/files and read them, but it won't be able to make any changes. Let's fix that
sudo apt-get install ntfs-3g
sudo mount -a
Reboot the RPi and ssh back in. Now you should be able to create and delete a test folder
cd /media/shares/data
mkdir test-folder
rmdir test-folder
4. Samba
Samba implements the cross-platform SMB protocol, which will let us reach the HDD plugged into the RPi over wifi from pretty much any operating system. We will have to set it up and open its ports in iptables. Let's get right to it!
sudo apt-get install samba samba-common-bin
Now let's set a samba password for user pi
sudo smbpasswd -a pi
Great, now it's time to add locations that will be accessible via Samba
sudo nano /etc/samba/smb.conf
Uncomment the following line in the "Authentication" section
security = user
Now scroll to the very bottom and add the following
[shares]
comment = Raspberry Pi shares
path = /media/shares
valid users = @users
force group = users
create mask = 0660
directory mask = 0771
read only = no
and restart Samba
sudo service samba restart
Now we need to add Samba's ports to iptables and we should be able to connect to it from our computer over wifi! Let's do it.
sudo nano /etc/network/iptables
And add lines for ports 137, 138, 139 and 445, so that it looks like this
*filter
:INPUT DROP [23:2584]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [1161:105847]
-A INPUT -i lo -j ACCEPT
-A INPUT -i eth0 -p tcp -m tcp --dport 22 -j ACCEPT
-A INPUT -i wlan0 -p tcp -m tcp --dport 22 -j ACCEPT
-A INPUT -i wlan0 -p tcp -m tcp --dport 137 -j ACCEPT
-A INPUT -i wlan0 -p tcp -m tcp --dport 138 -j ACCEPT
-A INPUT -i wlan0 -p tcp -m tcp --dport 139 -j ACCEPT
-A INPUT -i wlan0 -p tcp -m tcp --dport 445 -j ACCEPT
-A INPUT -m state --state RELATED,ESTABLISHED -j ACCEPT
COMMIT
Let's pull it in
sudo iptables-restore /etc/network/iptables
sudo iptables-save
And see if we can connect. Success!
5. SFTP
Now we're going to enable access to our data via SFTP. It piggybacks on SSH connection, so we don't have to open any additional ports.
sudo apt-get install vsftpd
(As pointed out in the comments, there's actually no need to install anything extra - SFTP support comes with the SSH server, so you can skip the vsftpd install above.)
That's it! You should be able to connect using SFTP protocol and port 22, using username "pi" and your private SSH key as authentication method.
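From a Linux/Mac terminal that's simply:

sftp pi@[pi-ip-address]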
Note: You might have to convert your SSH key into a .ppk file if you use FileZilla.
Other parts in this series
Part 1: Basic setup without any cables
Part 2: External drives, Samba and SFTP
Part 3: Amazon Glacier
Raspberry Pi 2: Basic setup without any cables (headless)
Today I want to show you how to set up a headless Raspberry Pi 2 without any extra cables (HDMI or ethernet), screens, keyboards, etc. You might have it all lying around, but you might just as well be on the go with only your laptop and a usb cable powering your Raspberry Pi.
You can still follow this guide if you connect your RPi directly to the router - just skip step 3, where I set up the wifi card.
I’ll assume you already have:
Raspberry Pi 2
SD card (8GB+)
power source (charger for your mobile phone will usually do)
compatible usb wifi adapter
1. Getting Raspbian
The first step is to download the Raspbian image that we’ll be working with. You can get it from here (I’m using version 2015-11-21). Extract it; the image should be around 3.9GB.
2. Writing it to SD card
Instead of trying to describe every possible way of writing the image on the SD card, I’m going to point you to an excellent resource on this topic - elinux.org article. Once you’re done with it, we can move to the next step.
I personally use the Disks utility on Ubuntu. You can select your card from the list on the left, choose “Restore Disk Image” from the cog menu on the right, and select your img file.
3. Wifi settings
Mac users: It looks like you can’t access EXT4 partitions without fiddling with 3rd-party software. The easiest way around it is to temporarily connect the RPi to your router with an ethernet cable, ssh in (see below) and continue setting things up in /etc/wpa_supplicant/wpa_supplicant.conf to get the wifi running. Another option is to create a VirtualBox VM running Ubuntu and mount the image there.
Don’t remove the SD card from the reader on your computer yet. We’re going to set up the wifi interface, so that you can ssh into the box over the wireless connection.
Open terminal and edit /etc/wpa_supplicant/wpa_supplicant.conf on the SD card (not on your machine).
Here’s how to open it with nano:
cd /path/to/your/sd/card/
sudo nano etc/wpa_supplicant/wpa_supplicant.conf
and add the following to the bottom of the file:
network={
    ssid="your-network-ssid-name"
    psk="your-network-password"
}
You can save the file with “ctrl+x” followed by “y”.
Now, put the SD card into the RPi, plug the wifi in and power it up.
4. Test ssh access
The easiest way to find your Raspberry Pi’s IP address is to check your router’s admin panel. In my TP-LINK router admin panel I have to go to “DHCP” and then “DHCP Clients List”:
Another way to find the IP address is to use nmap tool. One of the following commands should display Raspberry Pi’s IP address if your IP address is 192.168.1.XXX or 192.168.0.XXX:
sudo nmap -sP 192.168.1.0/24
sudo nmap -sP 192.168.0.0/24
nmap -p 22 --open -sV 192.168.1.*
nmap -p 22 --open -sV 192.168.0.*
Now that you know your Pi’s IP address, you should be able to ssh into it with:
ssh pi@[pi-ip-address]
The default password for user “pi” is “raspberry”.
5. raspi-config
Run:
sudo raspi-config
to expand the filesystem, change the user password and set the timezone (under internationalisation options).
6. Password-less login
It’s time to secure it a bit. Log out by executing:
exit
and copy your public ssh key into RPi with:
ssh-copy-id pi@[pi-ip-address]
Now you should be able to ssh into RPi without password:
ssh pi@[pi-ip-address]
Don’t have SSH key? No problem. Follow this guide from GitHub to create it.
7. sshd configuration
Now that we can ssh into RPi without password, it would be a good idea to disable password login.
sudo nano /etc/ssh/sshd_config
And change the following values:
#change it to no
PermitRootLogin yes

#uncomment and change it to no
#PasswordAuthentication yes
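For the changes to take effect, restart the SSH daemon (the service should be called “ssh” on Raspbian):

sudo service ssh restart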
From now on you will be able to ssh into your RPi only with your private SSH key. Nice!
8. Update
Let’s update RPi:
sudo apt-get update && sudo apt-get upgrade
It might take a while.
9. Watchdog
Now we’re going to install watchdog. Its purpose is to automatically restart RPi if it becomes unresponsive.
sudo apt-get install watchdog
sudo modprobe bcm2708_wdog
sudo nano /etc/modules
And at the bottom add:
bcm2708_wdog
Now let’s add watchdog to startup applications:
sudo update-rc.d watchdog defaults
and edit its config:
sudo nano /etc/watchdog.conf

and uncomment the following options:

max-load-1
watchdog-device
Start watchdog with:
sudo service watchdog start
10. Firewall
We’re going to use UFW (Uncomplicated FireWall) to restrict access to our RPi:
sudo apt-get install ufw
sudo ufw allow 22
sudo ufw enable
And we can see its status with:
sudo ufw status verbose
As you can see, we’re accepting incoming connections only on port 22.
11. fail2ban
Now we’re going to install fail2ban, which will automatically ban IP addresses that fail to get into our RPi too many times:
sudo apt-get install fail2ban
sudo cp /etc/fail2ban/jail.conf /etc/fail2ban/jail.local
Restart fail2ban:
sudo service fail2ban restart
and check current bans with:
sudo iptables -L
Done!
That’s it, our RPi is set up and much more secure.
Other parts in this series
Part 1: Basic setup without any cables
Part 2: External drives, Samba and SFTP
Part 3: Amazon Glacier
Scaling down
Minimalism has found its way into my pockets. I'm watching it carefully, trying to figure out what will be its next step. A lot has changed recently and I can't wait to see what's next!
Wallet
Somewhere in 2011 I scaled my wallet down from a sizable 1" brick into a nice, tiny pack that forces me to keep it tidy. There is no space for coins, an excess of banknotes (1 max) or anything else for that matter. Every new item in my wallet replaces one of the existing items, so each thing has to be really useful to land in there. The wallet is sleek, barely visible in my pocket and so comfortable that once in a while I have to check if it's still there (is that really a drawback?). Paper money is so yesterday.
Cameras
As a photo geek I used to own a DSLR, an SLR, a party camera (HP 320 FTW!), a couple of lenses and all sorts of lighting equipment, but as the years went by I lost more and more interest, slowly getting rid of my gear and considering replacing the bulky DSLR with something smaller. They say that the best camera is the one that is always with you, after all, so I decided to get myself a Canon G12. Nice and much smaller, almost pocketable. A few months later I finally acknowledged that I barely used it, because it was still a pain to carry around. Smaller doesn't mean small. The separate charger and pretty slow reaction time didn't help. I decided it was time for a drastic change - I wanted something REALLY small.
Tablet + MiFi
I got myself a Nexus 7 back in the day when it still shipped with 8GB of memory. It was (and still is!) a very decent device. Prompt updates from Google are always nice, and I consider a 7" device to be the best form factor of all the touch devices I've had a chance to get my hands on so far. Several of my blog posts were written on it and countless hours were spent browsing the internet, reading books and chatting with people. I've even used it as a sat nav for cycling, thanks to a pretty chunky 4400mAh battery that lasts considerably longer than my phone's. The lack of built-in GSM support was a PITA that forced me to buy a MiFi device to provide internet on the go for the tablet. Another device to carry around, charge and remember to take with me. Not good. Also, the size of the tablet was starting to be an issue for me - small enough to put into my jacket pocket while exploring a city (try to do that with an iPad), but not small enough to keep in my jeans pocket while in the pub, etc. Time for a drastic change.
Phone
For the past 2+ years I carried a ZTE Blade with a custom ROM and an overclocked processor from day 1. It is a really pleasant phone that can be bought for £60-£70 these days. A decent screen (480x800, 3.5") and not-that-terrible specs, but nothing amazing, especially in 2013. It did the job, especially paired with the tablet. The battery lasted for 3-4 days, mainly because I tried to use the tablet for all multimedia tasks. The tablet+MiFi+phone combo worked well most of the time and provided quite a pleasant experience (both tablet and phone running the newest Android 4.2.2), but you can't always have all these devices with you. Also, the battery started acting weird after I upgraded from 2.3.7 to 4.2.2, so I had to use either an old SonyEricsson phone or the tablet as a backup alarm clock. Not good. Time for a drastic change.
Looking for a solution
I looked at all the issues described above and came to one conclusion: it's time to get rid of all that big, bulky, inefficient stuff and replace it with something much smaller, yet better quality. I shortlisted my requirements:
has to be a phone - I want to have one device that Does It All(tm)
very good camera - I'm getting rid of my Canon camera, so I need a replacement
good specs - the phone waits for me and not the opposite
good screen - I want to be able to read a book on it on the go, perhaps watch a film on a plane
very good battery life / extended battery option - what good is a powerful phone if you can't use it?
good price - I don't want to spend a fortune on it
I found only one device that ticked all the boxes, a phone that I customised (as with every electronic piece of equipment that I own), that runs beautifully and meets all my needs - Samsung Galaxy S3.
One device for all digital needs
I got it used from eBay for a decent price and bought a MASSIVE 7000mAh battery for it. I replaced the stock software with nightly CyanogenMod 10.1 to get an experience as close to a Nexus device as possible. It's a beast. Snappy, pretty, takes amazing photos (for a phone) and lasts forever on a single charge. Quoting a classic, "on a scale of 1-10 it's a definite win". I'm still getting used to the size (the change from 3.5" to 4.8" is not a small one), but I'm sure it won't be a problem after a few weeks.
All in all, I retired my camera, tablet, MiFi and two phones and replaced it all with just a single device. WOW.
Other options?
The Samsung S4 has better specs and an available 7500mAh battery, but is much more expensive. The Samsung S4 mini or its successor could perhaps replace my current setup - I'd probably trade smaller overall size for a smaller screen (obviously) if the rest of the specs stayed decent. There's a bunch of phones that match my criteria with the exception of battery life, which is a shame. I don't understand phone manufacturers creating devices that barely last 1 day of moderate usage. I can't be the only person who wants to be sure that no matter what I do during the day, I'll still have some battery juice left before going to bed!
The future is bright
Currently I'm waiting for Google Wallet to become available in the UK, so I can test it and perhaps get rid of another thing to carry around. I doubt I'd stop carrying my wallet completely (it's not only credit cards!), but who knows.
Another, more interesting application would be replacing my laptop with my phone, using this dock. It features an HDMI port, 3 USB ports, AUX out and micro USB for power. I use an external screen most of the time anyway and I'm sure the S3's specs would be good enough to run the tools I use for web development. Canonical is working on an operating system that combines Android and Ubuntu Desktop. I can't wait to see it happen!
Do you feel like your devices own you? Did you recently get rid of a bunch of electronics? Share your story in the comments below!
All I want for Christmas is you, battery
There is an alarming trend in the technology world, especially in the mobile phone department. In the past your phone would last for a week or more on a single charge. Phones were mobile. You could unplug one on Friday, go hiking for the weekend and plug it back in when you came home after the adventure.
That's something unheard of for modern phones. Most of them will barely last 1 day. Technology went forward and batteries got better, but companies like HTC and Apple decided to slim phones down, cutting out as much fat (read: battery) as possible.
So many powerful devices, so little juice to keep them running.
We ended up with powerful devices that turn into bricks by 6pm. If you use your phone extensively - Facebook, Twitter, browsing the internet, maps, GPS, not to mention old-school texts and calls - you will struggle to get through an entire day without plugging your device into the mains to charge it up.
There is something that drives me mad every time I think about it: in the past, pretty much every laptop and mobile phone had a user-replaceable battery. If I needed 8h of battery time on my laptop and knew that a standard battery would last only 4h, I could buy another one and swap it in when needed. If I went for a hike with my GPS-enabled phone and expected to use it a lot, I could take one or two extra batteries and make sure I wouldn't get lost.
Now, more and more laptop and phone manufacturers opt out of user-replaceable batteries in exchange for sleek, unibody designs. They do look good, but if you are out and about, you won't be able to use them for long.
There are exceptions
I was really pleased to see the new MacBook Air that offers 12h of battery life, but it's still nowhere near the old Thinkpad X series (20+ hours). Another notable device is the Samsung S3, one of the very few modern, powerful phones that still has a replaceable battery. I recently purchased a Zerolemon 7000mAh extended battery, which kept my phone running for 3.5 DAYS after the first charge. That's with a fair bit of usage, including 12h of screen time.
Am I the only person who wants to have his gadgets running for days on a single charge?
CodeIgniter timing out after 300s ? Here's a solution
Hey CodeIgniter developers, here's another thing to look out for when developing with our favourite framework. We ran into this issue here at GatherContent a little while ago, trying to figure out why PHP scripts triggered from the CLI would always time out after 300s, even though every possible setting on the server was set to much higher values.
It turns out CodeIgniter overwrites the time limit that you set on your server with a "liberal" (according to the CI team) limit of 300s. Take a look:
(system/CodeIgniter.php, lines 100-108)
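From memory, the block in CodeIgniter 2.x looks like this (check your own copy, as the exact lines shift between versions):

/*
 * ------------------------------------------------------
 *  Set a liberal script execution time limit
 * ------------------------------------------------------
 */
if (function_exists("set_time_limit") == TRUE AND @ini_get("safe_mode") == 0)
{
    @set_time_limit(300);
}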
The problem is that 300s is way too long for the front-end parts of your app (no one's going to wait that long for a page to load), and way too short for the back-end parts (scripts generating massive PDFs, for example). It might be ok for most people most of the time, but it might bite you badly one day and you'll waste your time trying to figure out why the heck your code times out.
The best fix is probably to comment out the entire section and let your server decide how long scripts should run for. No more unexpected behaviour.
And if you read that far, check out another unexpected CodeIgniter issue that I found a while ago.
Yellowish tint on your Nexus 7 / Samsung S3? Here's how to fix it
I've been using my Nexus 7 for nearly a year now and never thought there would be a simple solution to a problem that every Nexus 7 owner faces (literally) - the yellow tint on the screen. We all know it, we've all seen it and there's nothing we can do about it, right? Wrong.
I've found an app that does an amazing job of fixing (more like hacking) the colours on my precious. It's called Screen Adjuster. The trick is very simple: the app displays a layer above everything else on the screen, which you can turn a little bit blue, which in turn offsets the yellow of the screen itself, making it look much more "white".
I've set blue to +13, left all the other sliders alone and set the app to autorun on startup and hide its status bar icon - as a result I have a tablet that looks good and is not littered with unnecessary notifications/icons/whatever.
Enjoy!
Production/development switch for your codebase
tl;dr: Add a ".production" file to the root folder of your codebase on production servers and a ".development" file to the root folder of your codebase on development servers (both files empty - only the name is important). Ignore them globally in your git repo, ignore them locally in your FTP settings or whatever you're using to push changes up to your servers (I'm using ST2 + an FTP plugin for the dev server), and add the following somewhere at the top of your index.php file:
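(The snippet itself was embedded as an image in the original post; here's a minimal sketch that matches the behaviour described below - adjust the messages and constant handling to taste.)

// pick the environment based on empty marker files in the codebase root
if (file_exists(dirname(__FILE__).'/.production'))
{
    define('ENVIRONMENT', 'production');
}
elseif (file_exists(dirname(__FILE__).'/.development'))
{
    define('ENVIRONMENT', 'development');
}
else
{
    exit('No .production or .development file found - refusing to guess the environment.');
}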
Now, you can put all settings for both production and development environments in your configuration files and choose the right one based on your ENVIRONMENT constant! Sweet, one less thing to worry about!
Long version
How do you deal with differences in your codebase between production environment and development environment?
In my case, I used to keep my codebase in git set to production (error displaying disabled, live base_url, live database, all functionality enabled) and, using git's "ignore locally" feature, kept several files changed so that error displaying was enabled, base_url pointed to the dev server, the database details were for dev, and emails were redirected to phpconsole.
What's wrong with that approach? Quite a few things, actually. What if a new developer sets up his account on the dev server without changing any config files and starts running his code against the live database? Well, we can change the repo to point to the dev db by default. But what happens if someone deploys code to one of the production servers and forgets to change the values to point to the live db? Oopsie, some of the clients are hitting our dev database!
One day I was wondering how to make it all work automatically and remembered that Beanstalk uses a ".revision" file in the root folder of your codebase to track which revision is deployed to your servers.
I thought "brilliant!" and decided to use similar approach:
The snippet of code from the tl;dr above sets an ENVIRONMENT constant that can be used to change your application's behaviour. If neither environment file is present, the page exits with information about what happened. This way we mitigate the risk of running the wrong version of the code, while keeping the convenience of a single codebase.
Of course, you're not limited to only 2 options, you can easily add another one, e.g. ".staging".
Let me know what you think about this approach in the comments below.
1 note
·
View note
Text
Super fast FTP upload (if you have thousands of files)
Update: Ok, reviewing this post a year later, I'm pretty sure you can totally ignore the "solution" below. The best way to sync a folder to a remote server (if you have SSH access) is to use rsync like so (-a preserves permissions and timestamps, -z compresses data in transit):
rsync -az ~/path/to/source remoteuser@remoteserver:/path/to/destination
Original post below:
I vaguely remember one of my ex-colleagues moaning about WordPress taking oh so long to upload via FTP. I was reminded of this problem a couple of days ago, while moving several of my sites to a new hosting provider. Pretty much every folder contained over 1000 files and the process took forever.
I came up with a solution that will probably be totally obvious and familiar to some of you, but I'm sure there are plenty of people who can benefit from this post.
Requirements:
SSH access to your server
Linux or Mac on your machine - with Windows there might be slightly more fiddling
Steps:
Open a terminal and navigate to the folder that contains your site's files
Run "zip -r code.zip *"
Upload the created code.zip file in your preferred way (I use Sublime Text 2)
SSH into your server and navigate to the folder where you uploaded code.zip
Run "unzip code.zip"
Delete both code.zip files. You can use "rm code.zip"
The biggest win is the fact that you have to upload only ONE file, which is much much faster than uploading thousands of separate files.
Want real numbers?
Ok, let's try it with the latest version of WordPress. The zipped file straight from their server is 5.2MB and (on my worse-than-usual connection) uploads in 1:26. The unzipped folder contains 1026 files, is 11.8MB and uploads in 21:06 (!)
The difference is less significant on a better internet connection, but more significant as the size of your project grows to a few thousand files and above.
Hey, all my WordPress files are in a subfolder! What did you do?!
Ok, ok, calm down, we can fix it real quick by moving all the WP files one folder up:
Run "cd wordpress/"
Run "mv * .."
The second command moves all files from the current folder one level up (".."). One caveat: "*" doesn't match hidden files, so if the folder contains any dotfiles (a .htaccess file, for example) you'll need to move them separately, e.g. "mv .htaccess ..".
2 notes
·
View notes
Text
How much code did you really write? (+ source code)
Have you ever thought about how much space is taken up by the characters (spaces and tabs) that precede your code, put there just to indent it for readability? I bet when you look at code in most editors you don't really notice it. It's there and it lets your creation breathe. But take a look at the following screenshot from Sublime Text 2 (an awesome editor, by the way):
Notice how many spaces there are! For bigger projects it can be mind-boggling. I started wondering: how many spaces are there in my code that aren't really code? How many are there on the phpconsole page? To answer that question I wrote a simple script that calculates the number of spaces preceding code within a selected folder. It takes into account *.php, *.html and *.htm files. I thought about adding *.css and *.js, but many of those are minified these days, so there's not much sense in including them.
Here's the code:
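(The script was embedded as a gist that's no longer part of this page; below is a minimal reconstruction based on the description above - the recursive scan of the current folder and the output format are assumptions.)

<?php
// Count indentation characters (leading spaces and tabs) in *.php,
// *.html and *.htm files under the current folder.
$extensions = array('php', 'html', 'htm');
$spaces = 0;
$total  = 0;

$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));

foreach ($files as $file)
{
    if ( ! $file->isFile() OR ! in_array(strtolower(pathinfo($file->getFilename(), PATHINFO_EXTENSION)), $extensions))
    {
        continue;
    }

    foreach (file($file->getPathname()) as $line)
    {
        $total  += strlen($line);
        // Everything stripped from the left by ltrim() is indentation.
        $spaces += strlen($line) - strlen(ltrim($line, " \t"));
    }
}

printf("%.2f%% of selected folder's files (%d spaces, %d total characters)\n",
    $total > 0 ? $spaces / $total * 100 : 0, $spaces, $total);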
My results:
phpconsole - landing page for my project that I encourage you to check out: 17.82% of selected folder's files (3774 spaces, 21181 total characters).
legierski.net - my homepage: 7.64% of selected folder's files (400 spaces, 5236 total characters).
CodeIgniter 2.1.3 - my PHP framework of choice: 3.70% of selected folder's files (92426 spaces, 2497844 total characters).
In CodeIgniter's case it's over 90KB of indentation!
How does it stack up against your code? Let me know in the comments below!
2 notes
·
View notes
Text
4 weeks into my 'Half hour' productivity hack
Yes, 4 weeks, and guess what - it still works! It's not one of those productivity hacks where you try to be super efficient and do more in the same amount of time (that doesn't work for most people). Instead, it's a really simple system that focuses on exploiting our habits.
Ok, so what is it all about?
A lot of us wake up early 5 days a week, work in the office for a few hours, come back home and, after dinner, negotiate with our inner selves: "I should be doing something creative. Why am I watching TV? This month I wanted to work on project X, but man, I'm so tired! I think I'll let myself off today and work hard tomorrow evening instead". And that happens pretty much every day.
I decided to break this bad habit by exploiting my most productive environment and working on my personal projects right after finishing work, for half an hour each weekday. No snacks, no breaks. Just get back to work!
Keeping creative work in one block makes things much easier, and half an hour (for starters) is not that long. After a while I may extend that period, but I believe that now, at the beginning, the most important thing is to reinforce the habit.
The hack works exceptionally well for me, and I feel much better being able to work towards self-development every day without noticeable effort. I can also enjoy the rest of the evening, knowing that I've already made some progress today.
I think it's also important to give yourself a weekend break from the routine. Enjoy your free time, do what you love, sleep longer if you need to, code/design/study languages if you really want to, and then get back to work on Monday.
Do you have similar experiences with productivity hacking? Have you tried it yourself? Share in the comments below.
3 notes
·
View notes
Text
CodeIgniter and its Download Helper - be careful!
Beware! Just the other day I experienced an unusual bug that turned out to be a feature of CodeIgniter's download helper. Downloading files works as expected in 99% of cases, but the remaining 1% is files without an extension. If you try to download a file without an extension, you'll see a blank screen - no information whatsoever about what went wrong, no error logs to work with and, of course, no file that you wanted to download in the first place.
Luckily, the issue is really obvious, and the fix comes in two flavours, both very easy to implement.
The problem is the helper's design. Take a look at the source code. In lines 51-54 there's an if statement that checks whether the filename contains a dot, i.e. whether it has an extension. If yes, it continues; if not, it returns FALSE. The script terminates without any explanation, instead of at least notifying us in the error logs (or the user, by displaying a message) that something is wrong.
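For reference, the check in CodeIgniter 2.x looks roughly like this (paraphrased from memory, so treat it as a sketch):

// system/helpers/download_helper.php, around lines 51-54:
// Try to determine if the filename includes a file extension.
if (FALSE === strpos($filename, '.'))
{
    return FALSE; // no dot, no download - and no explanation either
}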
I can see 2 reasonably clean fixes and 1 hack to get rid of that problem:
Fix no.1 (recommended) Create a file called MY_download_helper.php in the application/helpers/ folder. Copy in the contents of /system/helpers/download_helper.php and delete lines 51-54. The good thing about this solution is that you don't have to refactor any other code - the helper will continue processing the request and send an application/octet-stream header (because no extension is recognised). You just have to remember that the original helper may change in the future, and you should update your version accordingly.
Fix no.2 This fix is more future-proof, but requires a bit more work and changes the original filename. Create a file called MY_download_helper.php in the application/helpers/ folder and paste the following code:
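(The pasted code was embedded as a gist that's no longer part of this page; below is a minimal sketch consistent with the description - the fallback ".txt" extension is my assumption.)

<?php
// application/helpers/MY_download_helper.php
// Wrapper that guarantees the filename has an extension, so the
// original force_download() never bails out silently.
function force_download2($filename = '', $data = '')
{
    if (FALSE === strpos($filename, '.'))
    {
        // No extension - append a generic one. This is what changes
        // the original filename, as noted in the drawbacks below.
        $filename .= '.txt';
    }

    return force_download($filename, $data);
}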
Drawbacks: you have to change force_download() to force_download2() all over the app, and the downloaded file will have a different extension compared to the original file.
Quick hack (not recommended) Go to /system/helpers/download_helper.php and simply delete lines 51-54. It's fast and it works. Why is it not recommended, then? Because it will break as soon as you update CI to the latest version and the helper gets overwritten.
I hope this post will be helpful to me and to others desperately googling something like "codeigniter file download problem".
If you know a better solution to this problem - please, let me know and I'll post it here!
2 notes
·
View notes
Text
Stop Validating Email Addresses With Your Complex Regex - The PHP Way
In his recent article, David Celis argues that we should stop validating email addresses with regular expressions, saying that it's a waste of time and effort. Apparently, instead we should just rely on an activation email being sent to the address specified by the user. Is that really the best way?
Looks like it's a great moment to share my recent discovery:
filter_var($email, FILTER_VALIDATE_EMAIL);
This simple function call returns the email address itself if it's valid, or FALSE if it isn't - so just check the result. You can read more here. I know you all love PHP, it's so popular after all!
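A quick usage sketch:

<?php
$email = 'someone@example.com';

// filter_var() returns the e-mail string on success and FALSE on
// failure, so a strict comparison against FALSE is the safest check.
if (filter_var($email, FILTER_VALIDATE_EMAIL) !== FALSE)
{
    echo 'Looks like a valid address.';
}
else
{
    echo 'Nope, try again.';
}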
No regex, no mess.
Simple, isn't it? :)
6 notes
·
View notes