Hardware Acquisition Hacks
Here's a collection of some hardware acquisition hacks I've been using over the last few years. By no means are these original ideas–they're quite well-known within their own niches. But all the same, they don't seem to be very widespread among the other techie folks I talk to.
The goal of these hacks is to construct a computing setup that is inexpensive, upgradable, and repairable, which also makes it tinkerable. It's easier (and more fun!) to play around with and risk breaking old, cheap hardware than the latest shiny $2k laptop. Computing and storage have made such vast leaps that, for all but a tiny fraction of users, anything made within the last decade is more than enough to be a daily driver. This isn't aimed at people slinging around 4k video who need color-accurate displays. It's for web browsing (isn't every computer basically a Chromebook with some small extras at this point?), watching videos, and run-of-the-mill productivity stuff like word processing and spreadsheets.
Laptops
Old Thinkpads FTW, 'nuff said.
They're built like tanks, so you don't have to worry about dropping them once in a while. The back lid pops off easily so you can upgrade or replace the RAM and disk (and even the CPU on most of them). The Thinkpad keyboard is miles ahead of any other laptop keyboard, even the latest ones. There are tons of videos on YouTube walking through how to upgrade every model out there. People have even swapped out the display and trackpad to their liking.
You can find old Thinkpads in good condition for low 100s of dollars on eBay. If you go deep into the Old Thinkpad rabbit hole, you can even find plenty of non-functioning ones for tens of dollars that can be scavenged for parts.
Some notable YouTube channels that focus on old Thinkpads (comparing models, reviewing them, upgrading them): Laptop Retrospective, Sebi's Random Tech.
Desktop
I recently discovered the 1-litre (ultra-small) form factor via ServeTheHome's Project TinyMiniMicro. They're pretty much desktop machines, but in a much smaller form factor, while still retaining upgradability.
I haven't looked at every brand in detail, but from a quick comparison it looks like Lenovo carried some of their Thinkpad philosophy into their Tiny line, in that they're easier to open up and upgrade than the HPs and Dells. I recently bought an M710Q Tiny second-hand, and it was super-easy to add RAM and SSD. You can get in with one screw, and the SSD and RAM slots are tool-less, so no more messing with screws in there.
You can find a pretty decent used Tiny/Mini/Micro for around $200-$300 on eBay, and for that much you'll typically get a 6th or 7th gen Intel Core processor, 8 GB of RAM, and around a 200 GB SSD. Not all models have WiFi though, so watch out for that.
Or you could just dock your old Thinkpad.
Displays
I personally find that the element of my computing setup that most affects my productivity is the display. More specifically, having a large display. Turns out there's empirical data to support that. There's a tradeoff between screen real estate and ergonomics, though, so that you don't end up craning your neck from one side of the display to the other. I've found that at typical desk viewing distances (~3 ft), 40-43 inches works pretty well. This will be different for everyone, so you should experiment and find what works best for you.
But the hack is that 4k TVs (not computer monitors) in the 39-43 inch range are around $200 these days, which is less than half of what a comparable monitor goes for. Around Black Friday you can easily find deals on these TVs for as low as $150. Yes, you can probably tell the difference from a real monitor if you pixel peep, but you'll have to squint pretty hard.
Note that the default settings on these are optimized for TV viewing, not use as a computer monitor. Typically, you'll need to turn down the saturation to something in the middle of the range, and the sharpness all the way down. Some of the recent makes and models seem to have caught on to the fact that they'll get used as monitors and have settings bundled into "modes" for TV, PC and gaming.
May your hardware dollars go far and bring you joy!
Automatically edit out silence (dead air) from videos
Over on my YouTube channel, I make talking head videos covering various computer science topics, mostly research papers. One of the most time-consuming yet mind-numbingly boring parts of that process is finding all the dead air in the original video and editing it out. That's got to be automatable, right? I'd been looking for a while but couldn't find a good solution.
Massive props to Donald Feury, who created a script that combines ffmpeg and MoviePy to do just that.
I've been using that to edit the last couple of videos I created, and I've also been making minor improvements to it:
the original script depends on the "silencedetect" filter in ffmpeg, which in my experience can sometimes leave long stretches of silence undetected, forcing me to edit the video manually and then render it again
the original script uses a two-step process: first run the ffmpeg silencedetect filter, then use the output of that with MoviePy to render the final video
I've addressed those issues in this script, which does the silence detection using MoviePy's audio APIs, eliminating the dependency on ffmpeg and making the script self-contained so that it does the job in one invocation. It has proven to be much more reliable at detecting all the silent periods in the original video.
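To make the approach concrete, here's a rough sketch of what a MoviePy-only version looks like. This is not my script or Donald's; the filenames and the threshold/padding numbers are made up for illustration, and it assumes MoviePy 1.x, where subclip and concatenate_videoclips are available.

# Sketch: cut silent stretches out of a video using only MoviePy + NumPy.
# All constants below are illustrative; tune them for your own recordings.
import numpy as np
from moviepy.editor import VideoFileClip, concatenate_videoclips

AUDIO_FPS = 22050          # sample rate used for the loudness analysis
WINDOW = 0.1               # analysis window, in seconds
SILENCE_THRESHOLD = 0.01   # RMS below this counts as silence
MIN_SILENCE = 0.5          # only cut silences longer than this (seconds)
PADDING = 0.1              # keep a little breathing room around each cut

clip = VideoFileClip("input.mp4")
samples = clip.audio.to_soundarray(fps=AUDIO_FPS)
if samples.ndim == 1:                      # mono: reshape to (n_samples, 1)
    samples = samples[:, None]

# RMS loudness of each fixed-size window, thresholded into loud/silent.
win = int(WINDOW * AUDIO_FPS)
n_windows = len(samples) // win
loud = np.array([
    np.sqrt((samples[i * win:(i + 1) * win] ** 2).mean())
    for i in range(n_windows)
]) > SILENCE_THRESHOLD

# Collect [start, end] times of the non-silent regions, padded slightly.
keep, start = [], None
for i, is_loud in enumerate(loud):
    t = i * WINDOW
    if is_loud and start is None:
        start = max(t - PADDING, 0)
    elif not is_loud and start is not None:
        keep.append([start, min(t + PADDING, clip.duration)])
        start = None
if start is not None:
    keep.append([start, clip.duration])
if not keep:
    raise SystemExit("no non-silent audio found")

# Merge regions separated by gaps shorter than MIN_SILENCE, so brief pauses survive.
merged = [keep[0]]
for s, e in keep[1:]:
    if s - merged[-1][1] < MIN_SILENCE:
        merged[-1][1] = max(merged[-1][1], e)
    else:
        merged.append([s, e])

final = concatenate_videoclips([clip.subclip(s, e) for s, e in merged])
final.write_videofile("output_no_silence.mp4")

The shape of the sketch is simple: score loudness per fixed-size window, turn consecutive loud windows into keep intervals, and stitch the corresponding subclips back together in one go.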
Remapping arbitrary keys in Mac OS X
Just thought I'd put this tip out on the Internet...
So I recently got a Thinkpad Trackpoint II keyboard, which I like to use with my MacBook when docked to an external monitor. Yes, I realize that's mixed up.
But it happens to have a "PrtSc" key right on the space-bar row that is perfectly placed to be the "Cmd" key. Mac OS X, in its Keyboard Preferences, lets you remap some standard keys (Caps Lock to Ctrl, for example), but not PrtSc.
Following this documentation, this is the command I ended up with to map PrtSc to Cmd:
hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000046,"HIDKeyboardModifierMappingDst":0x7000000E7}]}'
The HID usage code for PrtSc is 0x46, the one for "Right GUI" (aka Cmd) is 0xE7, and you OR each of them into the 0x700000000 keyboard usage-page prefix.
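In case it's useful, a couple of companion invocations from the same hidutil interface let you inspect and undo the remap. As far as I can tell, the mapping doesn't survive a reboot, so you'll want to rerun the --set command at login (e.g. from a LaunchAgent):

hidutil property --get "UserKeyMapping"
hidutil property --set '{"UserKeyMapping":[]}'

The first prints the currently active user key mapping, and the second clears it.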
How Unix Won
Unix has won in every conceivable way. And in true mythic style, it contains the seeds of its own eclipse. This is my subjective historical narrative of how that happened.
I'm using the name "Unix" to include the entire family of operating systems descended from it, or that have been heavily influenced by it. That includes Linux, SunOS, Solaris, BSD, Mac OS X, and many, many others.
Both major mobile OSs, Android and iOS, have Unix roots. Their billions of users dwarf those using clunky things like laptops and desktops, but even there, Windows is the only viable non-Unix OS. Almost everything running server-side in giant datacenters is Linux.
How did Unix win?
It was built by programmers, for programmers. If you read the early papers describing Unix, you will see how the key abstractions (hierarchical filesystems, permissions, processes, interactive shells, pipes) have lasted conceptually unsullied for decades. That could only have happened if it exerted such a force field over geek minds that they propagated it.
The majority of it was written not in assembly language but in C, a higher-level language, which made it portable and enabled relatively easy ports to a wide variety of hardware.
A freak historical accident. Unix was developed at AT&T's Bell Labs, and AT&T was forbidden under its anti-trust settlement from commercializing products unrelated to its core telecom business. Hence, Unix was licensed very cheaply to universities, including UC Berkeley, which subsequently built one of the more influential branches of the Unix family tree: BSD. Apparently, AT&T classified Unix as industrial waste for tax purposes when licensing it!
Unix spread through academia, and those students spread it through corporations after they graduated. It's a strategy that has been used by every major tech company, only in this case it was organic.
And then came the Internet, and the whole universe of daemons, tools, protocols, and utilities that undergirded it was built natively on Unix. That wasn't a huge surprise because a lot of the people that built the Net were Unix natives. BSD open-sourced its TCP/IP stack, kicking off its wide adoption outside the military.
By the late 90s and early 2000s, Linux started taking over the server side. It has become the ultimate virtuous cycle in open source: when picking a kernel, one is virtually forced to go with Linux because of the huge community and the massive engineering that has gone into solidifying it. That's probably why Android picked Linux even though it was targeting the other end of the hardware spectrum.
So how and why would we move past Unix?
Unix, which got standardized into the POSIX spec, has accreted a tremendous amount of complexity over the decades. The POSIX spec is 3000 pages long. Linux now has nearly 400 system calls. What started as a clean, pure, minimal and elegant set of system abstractions has become a complex beast.
The very idea of the OS providing general-purpose abstractions with wide applicability is being challenged. When Unix was created, IO was orders of magnitude slower than the CPU, so there were enough CPU cycles to burn on high-level abstractions like a complex hierarchical filesystem. Now IO can easily saturate the CPU. At least on the server side, folks just want to get the fastest performance out of their hardware. That's leading to the rise of frameworks like SPDK and DPDK that bypass heavy OS abstractions for storage and networking in favor of applications directly accessing the raw hardware, rolling whatever abstractions they do need on their own.
The entire ecosystem in which an OS exists is changing. Application-level experiences are hermetic and make the underlying OS more or less irrelevant; just look at Android or iOS or ChromeOS. Programming to a virtual machine (like the Java VM) makes programmers much more invested in their language and runtime than in the OS.
The above two points raise an interesting question: what should a modern operating system even look like? Look, for example, at Fuchsia, which is going in the direction of a capability-based microkernel that pushes most drivers and OS services out to userspace. These are ideas that have been floating around in academic OS research for decades but could never gain real mainstream acceptance because of the high barrier to building a viable real-world OS. This is the effect Rob Pike was talking about in his "Systems Research is Irrelevant" talk. But the prior two trends are finally dislodging the iron grip of Unix, and OSs are becoming interesting again!
Manifesto for my channel
I've been putting up a video a week on my YouTube channel since about December 2018, and I suppose I should write out a clear manifesto for why I'm doing that and where I want to go with it.
I started putting out videos for the same reason I have a blog: I want to share knowledge, and learn from the feedback. I have learned so much from stuff other people have published, and I want to put a few drops back into that ocean.
Also, committing to a regular cadence of putting out content (once a week, every week) is a calming antidote to the general chaos of modern life.
Why YouTube? Why videos? Because I feel that's where the river of attention is. And because it lets me experiment with a new medium. I believe Kevin Kelly when he says that screen literacy is a necessary modern skill, and should be taken seriously. I'm constantly amazed at what a fantastic place YouTube is for pedagogy, and the vast range of creators with their own unique voices and personalities putting out videos that can teach you everything under the sun.
As someone raised firmly within the walls-of-text tradition, I found it very uncomfortable at first. But this was a different flavor of discomfort: it wasn't a fear of public speaking, which I don't have any problems with. I can't quite put my finger on it. My first video didn't even have a voiceover. My later ones have my talking head in a rectangle in the corner. Maybe I'll progress to the stage of being comfortable taking up the entire frame.
The majority of my videos so far have been part of my "Read a paper" series. I pick a technical paper, and walk through it, highlighting parts as I go along. So far they have all been computer science papers, with a strong bent towards old classics published in the late sixties and early-mid seventies, usually by Turing Award winners. I intend to branch out, both to more recent CS papers, as well as papers outside CS that I've found interesting and profound.
Why am I reading papers in videos? It was a format I instinctively gravitated towards; only later could I shake an articulable rationale out of my head. The thing is: for the longest time I've harbored deep misgivings about popular science and technology writing. I've come to the conclusion that if you really want to understand a new technical topic, you cannot rely on a pop-sci reading of it; you must bite the bullet and go read technical texts. And I realize that's hard and seems intimidating. I want to play the part of someone holding your hand as you walk through a technical paper, beginning to end, without simplifications, with the original words of the author in front of you. I guess this is where my walls-of-text upbringing comes to the fore again: even though these are videos, my goal is to pull you into the text and stress its primacy.
I do eventually want to make more opinionated videos (kind of like the blog), but I need to get more comfortable with this medium before I can do that. Sticking to the straight technical details makes it a bit easier for me right now.
I can't get myself to beg for likes and subs on camera, but I'll do it here: I hope you'll join me on a whimsical journey into real technical details with real technical papers. Please subscribe!