The tumblelog of Jan Krutisch. More at jan.krutisch.de
Text
A Razer update
Unfortunately I've been far too busy with actually doing things to frequently write updates on my experience with the Razer. So this is just a quick bullet point update to summarize the developments:
I'm mainly using Linux for development work now and it works great.
I'm running Ubuntu 18.10 by now, which finally fixed a very annoying bug with the touchpad. I'm still switching desktops: I ran Budgie for a while and really liked the looks, but it has quite a few annoying bugs. Right now I'm using Cinnamon and liking it.
One of the main annoyances of Linux is still the poor support for HiDPI displays. It mostly works, but the odd application suddenly is tiny. And since there's no standard, official API to detect this per screen, you'll get all kinds of interesting effects when mixing different kinds of screens. So when I have the machine connected to the big screen, I usually switch the laptop screen off, so that everything looks OK.
There were a couple of issues with screen flicker; I got those under control by basically following ALL OF THE ADVICE from rolandguelle's excellent documentation.
Tilix is a wonderful terminal for people like me who still haven't gotten on the tmux train.
Performance-wise this machine is a beast, which is why, with a tear in my eye, I shut down my beloved mid-2011 iMac, removed it from my desk and replaced it with a large screen my friend Benjamin is lending me.
Also, as my old iMac actually had spinning disks, my desk is now much quieter most of the time.
I mostly moved to Linux for development because the Windows Subsystem for Linux kind of works, but is simply too slow. Working outside of it comes with the usual "doing development on Windows" quirks. It works for some tasks, and I use it regularly to test on it.
The reboot times of these modern machines are ridiculously fast, which makes running a dual-boot system so much better.
I am surprised how good the driver situation is on Linux. The Apple USB Ethernet adapter just works, for example, while on Windows I had to download a dodgy driver from the original chip manufacturer and then click away all the warnings to be allowed to install that driver for the dongle. I was even able to find a driver for my ScanSnap S1300 document scanner.
The Razer itself is quite a nice piece of hardware, with the keyboard being the obvious weak point: the feel is terrible and wobbly, and the keyboard backlight only lights up the main characters, leaving you in the dark (literally) about the secondary functions, which is quite a bit of a hassle for someone who is still struggling to get used (again) to a non-Mac keyboard.
There are a couple of things I still struggle with on a day-to-day basis: the Wifi is slightly more unreliable than what I'm used to, and on Linux, Bluetooth often fails after waking up from suspend-to-disk. The situation got a bit better with Ubuntu 18.10; often I can simply re-enable Bluetooth instead of heading directly for a power cycle.
Unsolved (mostly because I haven't found the time to investigate the choices): Online and offline backup for Linux. Most of my work is in the cloud these days anyway, but still.
This still needs a more thorough writeup, but I wanted to at least quickly summarize my findings so far. All in all, I think I made a decent choice and definitely got a lot of performance for a lot less money than current MBPs (let's see what the impending Apple event brings, though), and with acceptable drawbacks (for now).
Text
An Experiment starts here
I'm typing this on a new computer I bought yesterday. On Windows.
This is the first computer I intend to use as a work machine that is not an Apple Mac in about 12 years.
There are a couple of reasons for that and maybe I'll find the time to write down my reasoning in detail one day.
For now, I'm just planning on quickly documenting my findings (my journey, if you will), because I think it might be interesting for people. Whenever I mentioned my plans on Twitter, I got a couple of replies from people who had also thought about transitioning away from the Apple platform, so this seems to be a common theme right now.
First, a couple of details. The machine I've bought is a Razer Blade Stealth, a typical but also not-so-typical entry in the so-called ultrabook category, weighing in at about 1300g in the typical 13" form factor. The most comparable machine is probably Dell's super popular XPS13.
I've bought the 512GB QHD+ version, with an 8th-generation Intel processor/chipset and 16GB of memory.
Why did I choose the Razer, especially over the well-established XPS13? Well, mostly because I found the device more interesting. It's a lot closer to Apple's design in a way, with an aluminium unibody and a very minimal enclosure, but with a very unique design language that includes an RGB-backlit keyboard (seems to be a gamer thing) and the tacky Razer logo on the back, but an otherwise very subdued approach (and in a beautiful black color).
The drawback seems to be that it will be much harder to run Linux on this thing without a lot of hacking around. Initial experiments I did today were inconclusive, but a flickering screen (which I couldn't really get fixed today) is not something I want to endure day to day. So Windows it is for now. I'll continue my experiments later.
The first thing I set up (after the machine took almost 5 hours to apply all needed updates, including the Spring Creators Update) was the Windows Subsystem for Linux, using Ubuntu. This alone (and its various weird aspects) will probably be worth a couple more blog posts, but so far everything seems to work. I'm not entirely sure this is the right approach, so I'll test setting up development environments natively on Windows as well, but I was at least able to set up a complete Rails environment within a couple of hours.
I've also installed a bunch of application software: Atom as my editor, the good old TotalCommander (I still had the license from back in the day!), EmClient as my first try for an email client (which currently borks on my CalDav server for some reason), Slack, 1Password and a couple of other things. One thing of note is ConEmu, which is a pretty cool replacement for the standard command prompt thing in Windows and works well with the bash.exe entry point for the Linux subsystem.
One thing I noticed is that npm and yarn are really slow on this machine, at least with the way I've set this up within the Linux subsystem. I need to figure out what's going on there.
I didn't really try to actually develop on this machine yet and I'm pretty sure I'll stumble over different things when I do so, but that will have to wait for another post.
Text
I ran into an interesting gotcha in Rails yesterday that I thought was worth documenting in a blog post.
I tried to use a rubygem with Rails 4.2 that used an interesting pattern to parse search queries (code paraphrased):
while querystring.sub!(/pattern/, '')
  phrase = $1
  operator = $2
end
Now, this is, as far as I can see, using undocumented behaviour of Ruby. If you're not familiar with the $1, $2 things up there, they are usually used (and, in that case, documented) with the regex match operator, like so:
/(12)/ =~ '1234'
=> 0
$1
=> '12'
$1 represents the first match group, $2 the second and so on.
Now, in my code, the parsing produced empty strings when used in the web app. It took me a while to get why.
For reasons that are beyond this article, the input to the query parser was an ActiveSupport::SafeBuffer, the Rails way of solving script injections by creating a String subclass that will only ever contain "safe" strings.
If you look at the implementation for sub! on this class in Rails 4.2.7.1 (found in core_ext/string/output_safety.rb), you'll end up with this (the actual code looks a bit different, this is the code that is ultimately generated):
def sub!(*args)
  @html_safe = false
  super
end
I don't know if you can see this on first try (I certainly didn't), but there's a subtle problem here, which causes exactly the effect I've experienced.
The $1, $2 variables are only available in the scope in which the matching method is called, despite the fact that they look like globals. It makes sense to limit the scope of these "special" variables, of course, so there's nothing wrong with this behaviour per se. But, of course, it breaks the undocumented behaviour relied on above: when you call ActiveSupport::SafeBuffer#sub!, the actual match happens inside that wrapper method, so the variables are set there and not in your calling scope.
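To illustrate, here's a minimal sketch with a plain String subclass that wraps sub! the same way (not the actual Rails code, just the scoping effect):

class Wrapped < String
  def sub!(*args)
    # super performs the match, so $1/$2 get set in *this* method's scope...
    super
  end
end

s = Wrapped.new("foo bar")
s.sub!(/(foo) (bar)/, "")
$1 # => nil -- ...and not in the caller's scope, where we'd need it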
My bug was fixed by calling to_str on the search query, which was fine for me conceptually.
But in general, you should probably not try to rely on undocumented behaviour. The code looks clever, but I personally dislike these special variables anyway, as they make your code hard to read. So what's the alternative?
For a second I thought: "Hey, sub! also has a block form that allows you to do stuff with the matches before specifying the replacement as the block result." The code would roughly look like this (wow, the indentation is really weird there with the nested while and block):
while (querystring.sub!(/pattern/) { |string|
  phrase = $1
  operator = $2
  ""
})
end
For the block form, the special variables are documented and in this case, the ActiveSupport replacement would theoretically work. Alas, it doesn't. Here things get interesting, because I think, in this case, this is actually a subtle bug in Ruby. I need to verify this with my own tests, but it looks to me as if the Ruby implementation of sub! has problems finding the correct scope for attaching the special variables.
Back to the drawing board. Luckily, it's actually quite simple:
querystring.scan(/pattern/) do |match|
  phrase = match[0]
  operator = match[1]
end
Of course, this is not necessarily equivalent. For example, the regex in the sub! version could decide to scan from the back of the string. Luckily that's not the case here, and the scan version actually works fine. The scan method itself is a bit weird, too: why doesn't the block simply get a MatchData object? Luckily, if you need it, $~ is available in the block. And also, because SafeBuffer doesn't consider scan a dangerous method (as it doesn't mutate the string itself), it is not patched, and the special variables work as expected.
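To make that concrete, a quick demo (the block parameter is the array of captures; $~ gives you the full MatchData):

"foo bar".scan(/(\w+)/) do |match|
  p match  # => ["foo"], then ["bar"]
  p $~[1]  # => "foo", then "bar"
end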
I also think that the code reads a lot more easily using scan, but that's a matter of taste. Adding to that, even if you dup the object in the beginning, mutating a string while analyzing it feels wrong.
Let's see if this project accepts pull requests.
UPDATE: I should have checked this, but my colleague linked me to it, of course there's already an issue filed. As if I would be the first to stumble over something :)
It references the "most obvious" solution (which I had also tried and discarded, simply because I think it is not "the right" solution): call querystring.to_str to convert to a regular String prior to the while loop. Why do I think this is not the right solution? Well, first, it makes assumptions about the incoming data that I don't want to make. Second, like I said, the original code relies on somewhat undocumented behaviour, which should be a dealbreaker.
Text
Lost time and a calendar hack
A few years ago, I gave my mum a photo calendar as a Christmas present. Since then, every year, she gets an update in the form of 12 new photos and corresponding calendar sheets.
Until last year, I was using a piece of software from my old printer's software package, called "Canon Easy Photo Print", which has a calendar module that is easily customizable and flexible enough for my purpose.
This year, I wanted to use my new laser printer to print the calendar sheets. But of course, Easy Photo Print is tied to Canon printers.
Wonderful. So I took a look at various (6!) programs that looked similar, both from the internet and the Mac App Store.
I will spare you the names, because all of those programs are HORRIBLE. None of them were configurable enough; each was missing at least one option I found crucial to achieve my design goals. Many of them were easy to crash, and most had really, really terrible UIs. I don't know how much time I spent on this, but I would assume at least 2 hours. 2 hours I will never get back.
After that, I gave in, sat down with the documentation of Prawn and started writing a small script to generate a PDF with the calendar. This was way simpler than I anticipated and took me about 30 minutes to build and another 30 minutes to customize so that the layout was perfect.
I've made a gist out of the script; the only thing missing is the font (I'm using DejaVuSerif). If I find the time, I'm going to turn this into a proper project. Not tonight.
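For a rough idea of how little Prawn code this takes, here's a minimal sketch of the approach (not the actual gist; the photo file names, the year and the layout values are made up):

require "prawn"
require "date"

Prawn::Document.generate("calendar.pdf") do
  (1..12).each do |month|
    start_new_page unless month == 1
    # photo_01.jpg .. photo_12.jpg are hypothetical file names
    image format("photo_%02d.jpg", month), width: bounds.width
    move_down 20
    text Date.new(2015, month, 1).strftime("%B 2015"), size: 24, align: :center
    move_down 10
    # one simple row of day numbers; the real script lays out proper sheets
    last_day = Date.new(2015, month, -1).day
    text (1..last_day).to_a.join("  "), size: 10, align: :center
  end
end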
Text
Developer Happiness by example
I'm currently working on a project with some node.js (grunt) based workflow tools. To make the job a little more challenging, we're doing this project for a company that has all their developers running on Windows. Yes, that kind of project.
I'm not going to bash Windows here. I don't understand why anyone in their right mind would do web development on a non-unix machine, but hey, it's a free world and it's a Java project, so OS doesn't really matter.
Now, node.js has a set of core functions (in the "path" module) that correctly deal with path names, regardless of the underlying OS. So do Python and Ruby. All of them have something like path.join, path.basename, path.dirname, path.relative and path.resolve.
The fun begins, of course, when you add web technology. Most browsers (except IE, of course) are quite allergic to backslashes in URLs, even on Windows. Here's a list of problems I ran into so far:
Writing out backslashes as asset URLs and hrefs in web pages
Having to concatenate relative URLs in web pages (forward slashes) with file system paths (backslashes)
Writing out backslashed paths to generated JavaScript files (for browserify), resulting in de-escaped and thus eliminated backslashes
Comparing arrays of dependencies that contain backslashes with local, relative URLs that contain normal slashes, resulting in failed lookups.
Yadda, yadda, yadda
The reason all of these problems exist is that Node (and I assume Python as well, I just can't check right now) simply exposes the issue to the developer. All methods of the node "path" module on Windows simply use backslashes. As you can imagine, that's fine as long as you're really only dealing with filesystem paths, which I am not.
Ruby, on the other hand, simply uses forward slashes all the time. For the most part, you never have to think about this, and you can happily concatenate paths, even by hand, with forward slashes. I'm pretty sure this has its own pitfalls, but I've never run into one so far. And the compatibility with web technology is obvious and instant.
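A tiny illustration of the difference (the Node result is from memory, so treat it as an assumption):

# Ruby, on any OS, including Windows:
File.join("assets", "js", "app.js")  # => "assets/js/app.js"

# Node's path.join on Windows would give you "assets\\js\\app.js" instead,
# which is exactly what then leaks into URLs and generated files.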
The contrast is stark: to replicate in Ruby the functionality I've written in JavaScript with node.js, I would have to write a lot less code. Code that is meaningless to the intended function of my program, as I am not in the business of path mangling, but in the business of generating web pages for a product documentation. And meaningless code makes the program less readable and harder to understand. It obfuscates the original function with useless noise.
On the other hand, this is code I have to care about. Code that I need to test and code that will break. It keeps me from spending time on the actual functionality I need to implement.
This is not optimized for developer happiness. It's probably optimized for code size (as it is very close to the POSIX and NT metal), or for transparency, but this is guesswork. I do believe, though, that it's a mere symptom of an underlying design philosophy (or maybe even the absence of one). While this is the most annoying issue I've run into so far, it is definitely not the only occasion where I had the "if I could only do this in Ruby" train of thought.
I'm not writing this to bash on the node.js developers or the community. While I certainly disagree with some of the decisions that have been made (and that are now very hard to change, I guess), I actually see Ruby as the exception here. The genius of simply deferring the slash conversion to the latest possible moment (usually out of sight of the developer) stems from a clearly articulated and applied design philosophy (best expressed by the tagline "Optimized for developer happiness"), which is quite outstanding. That's why I brought up the Python example: I would have probably dealt with very similar issues on Windows with a Python stack. Languages vary wildly on this particular issue, so there's not really a right or wrong.
Here's a crazy one to close this article: it seems that ImageMagick on Windows actually can't deal with backslashes and needs file name arguments in forward-slash form. Yup. Took me a while.
Text
Visiting retune (and resonate) conference
Today is the last day of the retune conference I'm visiting. After resonate in Belgrade earlier this year, this is only my second of these arts/technology conferences. As someone who has visited quite a bunch of tech conferences, I couldn't help but feel a little alienated by both, which made me think a little more about why that is. In the following few paragraphs, I'll try to sum up my findings.
Before that, let me say that I don't want this to be understood as a critique of sorts. Different events are different, and people seem to genuinely enjoy both types, so please take these as observations. While I do react negatively to some of these observations (as will be clear in the text), I don't see my own experience as in any way representative of visitors to these events. With this out of the way, here we go:
1. Presentation quality
With a few notable exceptions, the presentation quality at both resonate and retune was quite low compared to most tech conferences. This manifests itself in many ways. For example, quite a few speakers were actually struggling a lot with the English language, to the point of me not being able to follow a speaker because of a heavy accent and broken flow. I've seen tech speakers struggle with English, but not on this scale. But not only that: some speakers didn't even bother to prepare slides, erratically clicking around between browser windows or even just files on their desktop. Maybe they thought that their work stands for itself and they couldn't be bothered to prepare better, but as I'm used to (mostly) precisely prepared, crafted and rehearsed talks from tech people, I feel annoyed by this. It feels like negligence to me.
Apart from that, many presentations violated simple presentation 101 rules, like using strong contrasts and big fonts (which was especially important because the venue was quite long, so people looked at the screen from quite a distance).
Also, many of the presentations at retune had the tendency to be more "tell, don't show" than the other way round, with too few examples and too much theoretical talking: about motivations, which is fine, but also about processes, which would have been much better shown through examples.
There's also a weird tangent to this: it feels to me as if the arts are much more into bringing their work onto an academic level of sorts, something that only very seldom happens with tech talks.
2. Organisation and tech
I'll just explain this with a symptomatic weirdness from retune: speakers are forced to use hand mics, while after the talk, one of the organizers takes a body mic for a walk through the audience for questions. You might laugh about this, but it is symptomatic of the way the whole technology part was treated. (Maybe I'm wrong and there was a good reason for that weird setup. I'll try to find out.) But it is a bit weird for a scene that should have a lot of know-how on audio technology. At resonate, the acoustics in some of the rooms were so bad that you could only understand the speakers if you were sitting really close to them. Sometimes it took more than 10 minutes to set up the speaker's laptop because nobody was there to help.
3. Catering and Venue
Both resonate and retune had no free catering. That's not a problem as such and is also reflected in the ticket prices (also, I suspect that a conference like retune has a lot more trouble finding sponsors at the moment than most tech events). But somehow it was a bit weird to then have to pay 2,50 EUR for a bottle of water. The catering was priced okay, I guess; I didn't even try it. All in all, I liked the venues of resonate, more or less, while the retune venue was way too dark for my liking, and every speaker made jokes about how she/he was not able to see the audience. This artificial and probably unintentional divide was countered by the very nice discussion format, which allowed for longer interactions with the speakers afterwards.
4. Community
One of the more personal things that was a bit off-putting for me is that I know so few people from this scene that I have problems connecting. This is clearly also my fault (in contrast to what some of you might think, I am actually socially awkward), and I may be wrong on this, but the scene didn't feel as welcoming as my own scenes do (mostly Ruby and JavaScript at the moment).
Did I enjoy retune, or did I stay true to my grumpy self?
This all might read as if I didn't enjoy myself, and that would be wrong. I've seen some great talks and performances, have been inspired (arguably, this effect was bigger at resonate, which could also be partially attributed to exploring a new city (Belgrade) at the same time) and got to know a few very lovely people. So there. I know where I feel at home, but of course it's important to leave your cozy home at times to get stimulated. Both retune and resonate are able to do that, if you're interested in the intersection of arts and technology.
EDIT: I've corrected a few typos and also added a few paragraphs to better explain some things.
Quote
Conference organizers, everyone: stop asking for and distributing slides. They're not synonymous with the talk, or its information.
@peterbourgon on twitter, via @chadfowler.
I see what he's getting at: slides (and my talks usually fall into this category) are usually not worth a lot on their own. Nevertheless, I usually publish them, because they are at least very useful to the people I've been talking to: they contain a lot of links and image credits, and maybe they even help you remember stuff I was saying during the talk.
That being said, a video of a talk, while still incomplete, is a much better way to document it. Nowadays I often try to also write a blog post on the talk, which is a much better format if you need text. (Which, in essence, is what he wants you to do instead.)
In the end, I think publishing slides does serve a purpose, even if that purpose usually is not to work as a substitute for people who didn't see the talk. Am I missing something here?
Text
Oh well. Mailgun, a service that I love to use to turn emails into webhook calls, does awesome things.
Not so awesome is that their API keys contain dollar signs:
key-dasd678asd7asdd$e2jd
(I made this one up, of course)
Now, imagine you didn't notice this at first sight. Now imagine you want to run your app on Heroku, so you configure API keys and the like with environment variables.
So, to test it, you're doing this:
$ export MAILGUN_API_KEY=key-dasd678asd7asdd$e2jd
And now you're wondering why this does not work.
Or, you're using the wonderful Dotenv gem, which makes it a little easier to configure a lot of environment variables. And you create a .env file, which reads:
MAILGUN_API_KEY=key-dasd678asd7asdd$e2jd [...]
And you wonder why this doesn't work either.
And then you load up irb and you try this by hand:
> Dotenv.load
=> {"MAILGUN_API_KEY" => "key-dasd678asd7asdd"}
Whoa. What's happening here? And then it hits you: there's a dollar sign in the original string. Suddenly, the export command makes sense, as shells usually interpolate variables. But $e2jd is (hopefully) not a defined variable, so it gets cut out.
But what about Dotenv?
Turns out, a relatively large part of the relatively small Dotenv codebase does all kinds of substitutions to properly mimic shell behaviour. And that makes sense, of course.
The fix is easy: add a backslash to escape the dollar sign. Backslashes are always the solution.
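So, with the made-up key from above, the working versions of the two snippets look like this:

$ export MAILGUN_API_KEY=key-dasd678asd7asdd\$e2jd

and in the .env file:

MAILGUN_API_KEY=key-dasd678asd7asdd\$e2jd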
UPDATE: As Mailgun people on various channels (see comments) mentioned, they changed their API key format a while back to fix this; I just had an old API key. If you're having this problem, regenerating the API key should fix it. Thanks to Mailgun for the quick response!
Link
For some reason, after updating an app to Rails 4.1.5, Heroku stopped serving static assets.
It didn't really cross my mind to look for the X-Sendfile header in the empty response I got, and so we spent a whole lot of time trying out other things, cursing at Heroku. After almost giving up, suddenly the X in that curl response stood out.
Looking at the environment configuration, I checked x_sendfile_header, and sure enough, it was set. As this project had been hosted on Heroku for as long as any team member could remember, I tried to find the change that introduced the x_sendfile_header config, but gave up after a while.
The only reason it didn't bite us before was that Rails only started to use X-Sendfile for assets after the linked commit.
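For context, the offending setting lives in the production environment config and looks something like this (a sketch; the exact header value in our config is an assumption):

# config/environments/production.rb
# Tells Rails to delegate file delivery to a fronting webserver.
# On Heroku there is no such webserver, so responses come back empty.
config.action_dispatch.x_sendfile_header = "X-Sendfile"
# The fix for us: remove (or nil out) this setting.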
Growl.
It's not that I resent that change. It doesn't REALLY make sense, as you would usually be able to serve assets directly with a webserver if you have a webserver that understands X-Sendfile, but that's none of my business.
I have no idea why the sendfile config was in there. But this goes to show that even small changes, none of which were made in error, can ruin two developers' day.
Link
Via @amasoean
I'll add that I'd never work for a military contractor, which Google, at the latest since the acquisition of Boston Dynamics, most certainly is.
(I was poked by a Google recruiter once, some years ago. I'm not sure I am Google material, though.)
Text
Jan's quick pasta sauce
When I'm out of options on what to cook for lunch (which usually is on the quick side of things anyway), I usually have some pasta left somewhere in my drawers. Pasta sauces can be a lengthy affair (if you have never done that, you should try a real homemade bolognese sugo, which ideally cooks for a pretty long time), but this one is really quick, and I usually do it freestyle. I tried to remember what I did this time, because it was extremely tasty:
Put a generous amount of olive oil into a small pan on medium heat. The ingredients are supposed to swim in oil. Remember: Olive oil = good fat.
Add finely chopped garlic to taste (a good rule of thumb is one clove per person). I messed this up and only chopped it into slices, which works, but small pieces are better.
Add dried chilies, finely chopped or broken, also to taste. As this largely depends on your taste AND the potency of your chilies, I can't seriously give you any recommendations here.
Let it simmer for some time, so that the garlic is cooked but not brown.
Add a teaspoon or two of tomato paste. It won't easily dissolve, but that will be fixed later.
Add some dried herbs. I used home-grown oregano; I assume basil would work as well.
When the pasta is done, add 2-3 tablespoons of the pasta water before straining the pasta; this will dissolve the paste and bind the whole thing together. Add a bit of salt.
If possible, pour the strained pasta directly into the pan and give it a good stir so that the sauce is well distributed.
Serve, adding hard cheese to taste if you must.
Have some yoghurt ready for potential dosage errors on the chili side.
Simple, yet effective. That's why I love pasta.
Video
(embedded Vine clip of the LEDs in action)
A trip to IKEA ended with me getting a new version of their DIODER color LED contraption. It's basically a strip (or, in the new version, a puck-like structure) with a load of RGB LEDs, driven by a small and simple PIC-based controller unit that lets you either set a color using a dial or choose between two different ways of cycling through ALL THE COLORS!!!
It's a fun toy, but since the circuit is so simple (it's basically a power supply, the microcontroller plus controls, and the LED drivers), it's also very easy to hack. The end game, of course, is to address each light source (both DIODER sets contain four of them) individually, but for that you would need a dedicated driver chip. What's easy, though, is to remove the PIC microcontroller and the pot, attach an Arduino to the three channels and the PIC power supply, and bam! You can drive the DIODER set with the PWM channels of the Arduino.
The next step was to add the Ethernet shield and to drive the LEDs via UDP. My version of the Ethernet shield unfortunately doesn't play nice with the DIODER power supply, so currently I need to have the USB connected, but hey.
I devised the simplest UDP protocol I could come up with and wrote a little Ruby script to send some packets.
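Something along these lines (a sketch from memory, not the actual script; the three-byte payload, the address and the port are assumptions):

require "socket"

host, port = "192.168.1.42", 4444 # hypothetical shield address/port
socket = UDPSocket.new

# one byte per channel: R, G, B
def send_color(socket, host, port, r, g, b)
  socket.send([r, g, b].pack("C3"), 0, host, port)
end

# crude fade for testing
(0..255).each do |i|
  send_color(socket, host, port, i, 255 - i, 128)
  sleep 0.02
end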
Next, I used the rosc gem to let the Ruby script open up an OSC server. A little test with TouchOSC and a small three-slider control surface was already pretty cool, but as I saw that latency was pretty much non-existent with this setup, I wanted to go further.
So I built a little drum pattern in Ableton Live, added an additional MIDI track and painted some rhythmic controller envelopes on CTRLCHG 1, 2 and 3. In a first iteration, I then used OSCulator, a pretty cool tool that can convert basically any control signal into any other, to send OSC to my script. In the second iteration, I eliminated OSCulator by finally learning me some Max 4 Live.
The effect of RGB strips and pucks flashing synchronously to the music is pretty mesmerizing, I have to say.
The code can be found in this gist.
I was really happy with the result and also with the fact that I finally started looking at hardware hacking again, even if it was just a small project.
Text
Safari on iOS7 caches non-2xx responses. Yay.
I've not been able to research this in depth so far, but it seems as if mobile Safari on iOS7, under unclear circumstances, caches responses even if they are non-OK responses.
This is clearly wrong and broke our nginx auth_request-based login system: it renders a 403 error with a (Mozilla Persona) login button after authentication fails. After login, the user is redirected to the original address, and this should trigger a full reload (which would then pass authentication and render the real web page). In our case, it just loaded the login page from cache, which, due to Persona's behaviour, would trigger the login dance, which would redirect to the original address, which... you get the idea.
Fixable by setting no-cache headers when rendering the 403 page, but still pretty annoying.
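In case it helps anyone, a minimal Rack sketch of what the fix amounts to (assumptions: the login page is rendered by a small Ruby app behind nginx's auth_request; the header values are the usual belt-and-braces set):

require "rack"

login_app = lambda do |env|
  body = "<html><body><!-- Persona login button goes here --></body></html>"
  [403, {
    "Content-Type"  => "text/html",
    # the crucial part: forbid caching so iOS7 Safari re-requests the page
    "Cache-Control" => "no-cache, no-store, must-revalidate",
    "Pragma"        => "no-cache",
    "Expires"       => "0",
  }, [body]]
end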
Text
A quick experiment in deserializing HTML to initialize backbone collections
In preparation for my talk at Railswaycon 2012, I hacked together something I had been thinking about for a long time: what if you could let your server render out complete pages (for the sake of searchability, for example) and then initialize a Backbone collection from that HTML? Your Backbone app could then augment the statically rendered page with its own magic. [Here's a gist](https://gist.github.com/2820063). I'm going to publish it including examples and tests later on.
Quote
F*CK YOU GOOGLE!!! http://t.co/ap0vyaP1 #safari
sez [@johnny_cash](http://twitter.com/johnny_cash). I say: F*CK YOU APPLE!!! The Web Audio API has been available as a WebKit patch for what feels like at least 3 years and is (correct me if I'm wrong) available in stable Chrome for at least a year, or even one and a half (it has been present in beta versions for at least two). Mozilla and Google (and even Opera) are innovating the hell out of the web platform, and all that Apple does is fix the most blatant security issues. So Google's "Install a modern browser" may be a bit rude, or maybe even cheeky, but it's not completely unfounded.

This all wouldn't be much of a problem as long as we only talked about desktop browsers, because nobody uses Safari there anyway (market share is really small). But then there's the whole mobile space, with the iPad clearly dominating and Apple not allowing other browser engines onto the platform. This situation actively hurts the web. As much as I love my iPad, and as much as Apple did in the first place to make mobile browsing bearable and popular: this needs to change.