Guess latest Chrome version based on date?
Using the past two years of Chrome releases from Wikipedia: https://en.wikipedia.org/wiki/Google_Chrome_version_history
A simple linear fit (y = mx + b) matches the data almost perfectly, so we can guess the latest version of Chrome from the current epoch time.
JavaScript fiddle proof of concept: https://jsfiddle.net/artfulhacker/e6c1qmff/
var epoch = Date.now() / 1000;
var slope = 0.000000251;
var yintercept = -316;
alert(Math.floor(epoch * slope + yintercept));
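Flipping the fit around gives a rough release-date estimate for a given version number. This is just the inverse of the formula above (same fitted constants), and the hedge applies doubly: the fit drifts as Chrome's release cadence changes.

```javascript
// Invert the fit: version = slope * epochSeconds + yintercept
// => epochSeconds = (version - yintercept) / slope
var slope = 0.000000251;
var yintercept = -316;

function estimateReleaseDate(version) {
    var epochSeconds = (version - yintercept) / slope;
    return new Date(epochSeconds * 1000);
}

// Chrome 45 actually shipped September 2015; the fit lands within a couple of months.
console.log(estimateReleaseDate(45).toUTCString());
```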
Beware, even things on Amazon come with embedded malware...
I needed a simple set of good outdoor surveillance cameras for a friend's home. Like everything else I buy, I turned to Amazon. I found (what seemed like) a great deal for a set of 6 PoE cameras and the necessary recording equipment. Here is the link:

http://www.amazon.com/Sony-Chip-Camera-1080P-CCTV/dp/B00YMEVSGA
When trying to get the cameras working on my friend's machine, I logged into the admin webpage to configure them. Right away something seemed a bit off: the interface showed the camera feed, but none of the normal controls or settings were available. Being one of those guys who assumes bad CSS, I went ahead and opened developer tools; maybe a bad style was hiding the options I needed. Instead, tucked at the bottom of the body tag, I found an iframe linking to a very strange-looking host name. See the host name in the screenshots below:
At this point I went ahead and googled the domain, and guess what came up? If you read the title, you already have an idea:
Malware. See this page, or simply go through this Google search, for more information. My guess is many people have missed this. The seller has great ratings and the products are a good deal. So be careful what you buy!
Looks like it was pointed out in a forum a month ago here: http://forums.whirlpool.net.au/forum-replies.cfm?t=2362073&p=11&#r211
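If you want to run the same check on your own gear, here is a rough sketch of the DOM inspection as a console snippet. The allow list is a made-up example; substitute whatever hosts your device legitimately talks to.

```javascript
// List iframes whose source resolves to a host outside an allow list.
// Paste into the browser console on the device's admin page.
function findSuspiciousIframes(doc, allowedHosts) {
    return Array.from(doc.querySelectorAll("iframe")).filter(function(frame) {
        try {
            var host = new URL(frame.src, doc.baseURI).hostname;
            return allowedHosts.indexOf(host) === -1;
        } catch (e) {
            return true; // an unparseable src is suspicious too
        }
    });
}
```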
TL;DR: First off, only 223 words... really?? Second, Amazon stuff can contain malware.
a year?
it has officially been a year since my last post...
#thestartuplife
Azure CDN now Supports CORS
So we can finally host fonts in our Azure CDN!!!
This goes for anything else subject to cross-origin resource sharing restrictions.
So update your Azure storage client library (to > 3.0) and do something like this:
var client = storageAccount.CreateCloudBlobClient();
var properties = client.GetServiceProperties();

var cors = new CorsRule();
cors.AllowedOrigins.Add("artfulhacker.com");
cors.AllowedMethods = CorsHttpMethods.Get;
cors.MaxAgeInSeconds = 3600;

properties.Cors.CorsRules.Add(cors);
client.SetServiceProperties(properties);
If you want to allow any parent calling domain use this instead:
cors.AllowedOrigins.Add("*");
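For context, the browser's side of the handshake boils down to comparing the response's Access-Control-Allow-Origin header against the requesting origin. This is a deliberately simplified sketch (real CORS also covers credentials, preflights, and exposed headers):

```javascript
// Simplified model of the browser's origin check on a CORS response.
function originAllowed(allowedOriginHeader, requestOrigin) {
    if (allowedOriginHeader === "*") return true;
    return allowedOriginHeader === requestOrigin;
}
```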
More information on this can be found here:
http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx
What are some amazing secrets hidden within Disneyland and Disney World?
Answer by Mike Olsen:
When Walt originally purchased the land for Disneyland from the Dominguez family, they had a stipulation: he had to keep a date palm that was presented to the family in 1896 from the Canary Islands. The tree was a wedding gift and held great sentimental value. To me, what makes this an amazing hidden secret is that Disneyland has still taken great care to preserve it; even now, over a hundred years old, it is not only still at the park, you can even touch it! Next time you visit the park, go get a FastPass for Indiana Jones; just before you enter the FastPass area, look to the right and you will see it (along with some bamboo growing next to it).
This is an "amazing hidden secret" because it still shows the respect and values of the Disney company. They have owned the land long enough that this tree could easily have been removed by now. You can even see the great care taken when the new Jungle Cruise queue was built.
View Answer on Quora
Simple JSON Array Methods in C#
Echovoice.JSON
Echovoice JSON Array Encode, Decode and Pretty methods. Used internally until the public release of WS3V.
This library is available on Nuget as Echovoice.JSON.
Why?
Json.net was too big for simple JSON array encoding and decoding, and the way to use it was far too complex.
JavaScriptSerializer uses the odd JsonArray class; all we wanted was simple strings, arrays or numbers.
Decode Usage
Simple JSON array to string array JSONDecoders.DecodeJsStringArray()
string input = "[\"philcollins\",\"Ih8PeterG\"]";
string[] result = JSONDecoders.DecodeJsStringArray(input);
result[0]: philcollins
result[1]: Ih8PeterG
Complex JSON Array JSONDecoders.DecodeJSONArray()
string input = "[14,4,[14,\"data\"],[[5,\"10.186.122.15\"],[6,\"10.186.122.16\"]]]";
string[] result = JSONDecoders.DecodeJSONArray(input);
string[] result2 = JSONDecoders.DecodeJSONArray(result[3]);
result[0]: 14
result[1]: 4
result[2]: [14,"data"]
result[3]: [[5,"10.186.122.15"],[6,"10.186.122.16"]]
result2[0]: [5,"10.186.122.15"]
result2[1]: [6,"10.186.122.16"]
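For comparison, the same top-level split can be sketched in plain JavaScript using the built-in parser. This is illustrative only, not how the library is implemented:

```javascript
// Decode a JSON array into strings, one per top-level element,
// mirroring the DecodeJSONArray results shown above.
function decodeJsonArray(input) {
    return JSON.parse(input).map(function(el) {
        return typeof el === "string" ? el : JSON.stringify(el);
    });
}
```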
Encode Usage
Simple object to JSON Array EncodeJsObjectArray()
public class dummyObject
{
    public string fake { get; set; }
    public int id { get; set; }

    public dummyObject()
    {
        fake = "dummy";
        id = 5;
    }

    public override string ToString()
    {
        StringBuilder sb = new StringBuilder();
        sb.Append('[');
        sb.Append(id);
        sb.Append(',');
        sb.Append(JSONEncoders.EncodeJsString(fake));
        sb.Append(']');
        return sb.ToString();
    }
}

dummyObject[] dummys = new dummyObject[2];
dummys[0] = new dummyObject();
dummys[1] = new dummyObject();
dummys[0].fake = "mike";
dummys[0].id = 29;

string result = JSONEncoders.EncodeJsObjectArray(dummys);
Result: [[29,"mike"],[5,"dummy"]]
Pretty Usage
Pretty print JSON Array PrettyPrintJson() string extension method
string input = "[14,4,[14,\"data\"],[[5,\"10.186.122.15\"],[6,\"10.186.122.16\"]]]";
string result = input.PrettyPrintJson();
Result:
[
    14,
    4,
    [
        14,
        "data"
    ],
    [
        [
            5,
            "10.186.122.15"
        ],
        [
            6,
            "10.186.122.16"
        ]
    ]
]
Use Zopfli Compression in your CDN
Several months ago Google released Zopfli, a compression algorithm that is slow (~100 times slower than normal deflate) but achieves about 4-8% better compression.
Zopfli has also been adapted to be used in PNG compression:
ZopfliPNG will do nothing more than read a PNG file, re-compress the DEFLATEd parts (IDAT image data, and if the tool matures, other compressed chunks like iTXt, zTXt, and iCCP) and write the modified file.
When should we use this? The best place to use Zopfli is on your static content, so we choose to use it on our Azure CDN.
Zopfli is a compression algorithm that is compatible with the DEFLATE algorithm used in zlib, allowing it to be used seamlessly with already deployed programs and devices that support the standard. Zopfli produces files that are 4-8% smaller than zlib at the expense of being substantially slower to compress a file than other implementations of the DEFLATE algorithm.
Perfect fit, since CDN files are only uploaded once and compression only needs to happen on upload. Clients downloading compressed files will decompress them using the normal DEFLATE supported everywhere at the same speed (or faster, since they will be smaller in size).
Our first step was to get Zopfli into C#, so we created a wrapper and released it under Apache 2.0:
https://github.com/echovoice/libzopfli-sharp
Nuget Package is also found here: https://www.nuget.org/packages/libzopfli-sharp
PNG Compression Usage
If you are working with .NET Image objects, simply call the SaveAsPNG() Image extension method.
Image testImage = Image.FromFile("files/ev.png");
testImage.SaveAsPNG(path_to_save_compressed_PNG);
You can compress *.PNG files directly using the ZopfliPNG.compress() method.
string path = "files/ev.png";
ZopfliPNG.compress(path);
We also implemented a derived class of Stream called ZopfliPNGStream
byte[] uncompressed = File.ReadAllBytes("files/ev.png");
int before = uncompressed.Length;
byte[] compressed;
int after = 0;

using (MemoryStream compressStream = new MemoryStream())
using (ZopfliPNGStream compressor = new ZopfliPNGStream(compressStream))
{
    compressor.Write(uncompressed, 0, before);
    compressor.Close();
    compressed = compressStream.ToArray();
    after = compressed.Length;
}
In addition to the default compression options, Zopfli exposes some additional options to fine-tune compression. We exposed these in the ZopfliPNGOptions object.
public class ZopfliPNGOptions
{
    // Allow altering hidden colors of fully transparent pixels
    public Boolean lossy_transparent;

    // Convert 16-bit per channel images to 8-bit per channel
    public Boolean lossy_8bit;

    // Filter strategies to try
    public ZopfliPNGFilterStrategy[] filter_strategies;

    // Automatically choose filter strategy using less good compression
    public Boolean auto_filter_strategy;

    // PNG chunks to keep
    // chunks to literally copy over from the original PNG to the resulting one
    public String[] keepchunks;

    // Use Zopfli deflate compression
    public Boolean use_zopfli;

    // Zopfli number of iterations
    public Int32 num_iterations;

    // Zopfli number of iterations on large images
    public Int32 num_iterations_large;

    // 0=none, 1=first, 2=last, 3=both
    public Int32 block_split_strategy;
}
Gzip, Deflate and Zlib Compression Usage
For all 3 compression types we implemented a derived class of Stream, ZopfliStream:
byte[] uncompressed = File.ReadAllBytes("files/fp.log");
int before = uncompressed.Length;
byte[] compressed;
int after = 0;

using (MemoryStream compressStream = new MemoryStream())
using (ZopfliStream compressor = new ZopfliStream(compressStream, ZopfliFormat.ZOPFLI_FORMAT_DEFLATE))
{
    compressor.Write(uncompressed, 0, before);
    compressor.Close();
    compressed = compressStream.ToArray();
    after = compressed.Length;
}
The second parameter for our derived Stream class is the type of compression to use.
public enum ZopfliFormat
{
    ZOPFLI_FORMAT_GZIP,
    ZOPFLI_FORMAT_ZLIB,
    ZOPFLI_FORMAT_DEFLATE
};
In addition to the default options, Zopfli exposes some additional options used to fine-tune compression. We exposed these in the ZopfliOptions object, which can also be passed into the Stream.
public class ZopfliOptions
{
    // Whether to print output
    public Int32 verbose;

    // Whether to print more detailed output
    public Int32 verbose_more;

    // Maximum amount of times to rerun forward and backward pass to optimize LZ77
    // compression cost. Good values: 10, 15 for small files, 5 for files over
    // several MB in size or it will be too slow.
    public Int32 numiterations;

    // If true, splits the data in multiple deflate blocks with optimal choice
    // for the block boundaries. Block splitting gives better compression.
    // Default: true (1).
    public Int32 blocksplitting;

    // If true, chooses the optimal block split points only after doing the iterative
    // LZ77 compression. If false, chooses the block split points first, then does
    // iterative LZ77 on each individual block. Depending on the file, either first
    // or last gives the best compression. Default: false (0).
    public Int32 blocksplittinglast;

    // Maximum amount of blocks to split into (0 for unlimited, but this can give
    // extreme results that hurt compression on some files). Default value: 15.
    public Int32 blocksplittingmax;
}
Where to go from here?
Now that we have Zopfli working in C# you will want to wrap any of your Azure Blob uploads with the Zopfli stream.
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");

// Retrieve reference to a blob named "myblob".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob");

// Create or overwrite the "myblob" blob with contents from a local file.
byte[] uncompressed = File.ReadAllBytes(@"path\myfile");
int before = uncompressed.Length;
byte[] compressed;

// test deflate stream compression code
using (MemoryStream compressStream = new MemoryStream())
using (ZopfliStream compressor = new ZopfliStream(compressStream, ZopfliFormat.ZOPFLI_FORMAT_DEFLATE))
{
    compressor.Write(uncompressed, 0, before);
    compressor.Close();
    blockBlob.UploadFromStream(compressStream);
}
from MSDN http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
MSDN has a great write up on enabling the CDN from a Blob.
http://www.windowsazure.com/en-us/develop/net/common-tasks/cdn/
If you still need help uploading to Azure, or with determining whether compression is supported, I suggest viewing this (slightly outdated) tutorial:
http://joelfillmore.com/serving-gzip-compressed-content-from-the-azure-cdn/
Fix Pixelated Font Icons in Chrome on Windows
Firstly, if you have not made the switch to font-based icons, stop reading and check out this (awesome) free icon pack:
http://fortawesome.github.io/Font-Awesome/
So why use fonts instead of images?
1.) Any size: clean and crisp at any resolution.
2.) Small: font files are much smaller than a series of images, especially for those creating double-sized images to support Retina displays.
3.) Supported by legacy and modern browsers alike, on any platform.
Before you jump in, you should really check out IcoMoon by Keyamoon. I have never used a free web app of this quality before; the author has done a really good job.
Getting back on topic: if you read IcoMoon’s blog here: http://icomoon.io/#post/318 you will discover Chrome running under Windows has some real problems with font rendering.
Here are some samples from Chrome, IE and Firefox (all on Windows).
Chrome [Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.76 Safari/537.36]:
IE10 [Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)]:
FF20 [Mozilla/5.0 (Windows NT 6.2; WOW64; rv:20.0) Gecko/20100101 Firefox/20.0]:
How do we fix this?
So the initial solution is to move SVG to the top:
@font-face {
    font-family: 'echovoice';
    src: url('/fonts/echovoice.svg#echovoice') format('svg');
    ....
Why is this BAD?
"SVG fonts lack compression and therefore they are very large in size compared to other font formats"
Fine, let's use CSS to detect Chrome and force it to use SVG (from stackoverflow.com):
@font-face {
    font-family: 'echovoice';
    src: url('/fonts/echovoice.eot');
    src: url('/fonts/echovoice.eot?#iefix') format('embedded-opentype'),
         url('/fonts/echovoice.woff') format('woff'),
         url('/fonts/echovoice.ttf') format('truetype'),
         url('/fonts/echovoice.svg#icomoon') format('svg');
    font-weight: normal;
    font-style: normal;
}

@media screen and (-webkit-min-device-pixel-ratio:0) {
    @font-face {
        font-family: 'echovoice';
        src: url('/fonts/echovoice.svg#echovoice') format('svg');
    }
}
Up until now all I have done is reiterate IcoMoon’s blog post. Now for the reason I decided to write this post...
If you take a look at the network activity under Chrome you will see the following:
So it loads both the SVG and the WOFF; this isn’t ideal at all.
My solution is to move the above CSS into a separate stylesheet (echovoice.css), now that noscript tags are legal inside the head:
<noscript>
    <link href='/fonts/echovoice.css' rel='stylesheet' type='text/css'>
</noscript>
This allows users with JS disabled to pull in the styles that load the fonts.
Then, for our JS users, I added the following directly below the noscript tags in the head:
<script type="text/javascript"> var b="<style>@font-face{font-family:'echovoice';src:";/win/.test(navigator.userAgent.toLowerCase())&&/chrom(e|ium)/.test(navigator.userAgent.toLowerCase())||(b+="url('/fonts/echovoice.eot');src:url('/fonts/echovoice.eot?#iefix') format('embedded-opentype'),url('/fonts/echovoice.woff') format('woff'),url('/fonts/echovoice.ttf') format('truetype'),");b+="url('/fonts/echovoice.svg#icomoon') format('svg');font-weight:normal;font-style:normal}</style>"; document.write(b); </script>
What this does is test for the combination of Chrome and Windows: if that combination is found, it writes only the SVG path for src; otherwise it writes paths for all the font options.
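Unminified, the browser test in that script amounts to two regular expressions over the user agent string (sketch below). Be aware that UA sniffing is fragile; other Chromium-based browsers on Windows will also match.

```javascript
// The test at the heart of the minified script above:
// "win" and "chrome"/"chromium" must both appear in the lowercased UA.
function isWindowsChrome(userAgent) {
    var ua = userAgent.toLowerCase();
    return /win/.test(ua) && /chrom(e|ium)/.test(ua);
}
```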
Success! Only the SVG file is downloaded in Chrome under Windows, and no more fuzzy icons!
How to delete a project in Team Foundation Service (tfs.visualstudio.com)
UPDATE: As of today there still isn’t a UI-based option to remove a project in Team Foundation Service.
http://stackoverflow.com/questions/13635889/delete-team-project-from-team-foundation-service/20061822#20061822
So how do we do it?
Turns out to be much simpler than I had mentally prepared myself for.
Open an Administrator Command Prompt (Win+X, A) and change directory to:
C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE
Then run the following command, substituting the values inside the [ ] brackets.
TFSDeleteProject.exe /collection:"https://[name].visualstudio.com/DefaultCollection" "[project name]"
You should then see a warning; press y to delete the project. Done.
Note: I have VS2012 installed, and VS is already joined to my Team Foundation Service.
Latest Web UI Trend: Ajax Loading Bars
When loading content in via ajax, web browsers do not usually give a clear indication that they are fetching additional content. This is typical of endless-scrolling sites; however, the latest UI trend, as seen on Medium and YouTube, is an animated loading bar across the top of the page while additional content is fetched. Here is a screenshot of the Medium example:
So how do they do it?
Well, let's start with some basic HTML/CSS. Using developer tools in Chrome on the page, I noticed that they kept adding a class to the body tag:
app-loading
So I added the class manually, and the loading bar showed up and stayed put. Still using developer tools, I found the div that makes up the loading bar at the bottom of the HTML, right above the closing script tags. Here is the HTML that makes up the loading bar:
<div class="loading-bar"></div>
Super simple so far. Now the CSS; by default the style applied is:
.loading-bar {
    position: fixed;
    display: none;
    top: 0;
    left: 0;
    right: 0;
    height: 2px;
    z-index: 800;
    background: #60d778;
    -webkit-transform: translateX(100%);
    -moz-transform: translateX(100%);
    -o-transform: translateX(100%);
    transform: translateX(100%);
}
Notice the display: none; this hides the bar from view initially. Looking further through the CSS, we see that the app-loading class overrides this default display: none.
.app-loading .loading-bar {
    display: block;
    -webkit-animation: shift-rightwards 1s ease-in-out infinite;
    -moz-animation: shift-rightwards 1s ease-in-out infinite;
    -ms-animation: shift-rightwards 1s ease-in-out infinite;
    -o-animation: shift-rightwards 1s ease-in-out infinite;
    animation: shift-rightwards 1s ease-in-out infinite;
    -webkit-animation-delay: .4s;
    -moz-animation-delay: .4s;
    -o-animation-delay: .4s;
    animation-delay: .4s;
}
So by adding the app-loading class to the body while loading content into the page, you effectively get an animated loading bar fixed to the top of the page.
The last code sample is the CSS3 animation we “hijacked” from Medium. They called it shift-rightwards, so let's give that developer credit and keep the name in the demo.
@-webkit-keyframes shift-rightwards {
    0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
    40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
}

@-moz-keyframes shift-rightwards {
    0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
    40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
}

@-o-keyframes shift-rightwards {
    0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
    40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
}

@keyframes shift-rightwards {
    0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
    40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
    100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
}
Easy enough. So how can we apply this to our own websites?
Just before making an ajax request, add the class app-loading to the <body> tag.
After the ajax has finished (make sure to also catch timeouts and errors), remove the class from the body tag.
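Without jQuery, those two steps can be wrapped in a small helper; a hedged sketch, with names of my own choosing:

```javascript
// Add the class before the request starts, remove it when the request
// settles (success or failure alike).
function withLoadingBar(body, request) {
    body.classList.add("app-loading");
    return request().finally(function() {
        body.classList.remove("app-loading");
    });
}
```

Here `request` is any function returning a Promise, e.g. one wrapping fetch or $.get.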
Full Working Example using jQuery:
http://jsfiddle.net/MuVaa/
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <title>Loading Bar Sample</title>
    <style>
        .app-loading .loading-bar {
            display: block;
            -webkit-animation: shift-rightwards 1s ease-in-out infinite;
            -moz-animation: shift-rightwards 1s ease-in-out infinite;
            -ms-animation: shift-rightwards 1s ease-in-out infinite;
            -o-animation: shift-rightwards 1s ease-in-out infinite;
            animation: shift-rightwards 1s ease-in-out infinite;
            -webkit-animation-delay: .4s;
            -moz-animation-delay: .4s;
            -o-animation-delay: .4s;
            animation-delay: .4s;
        }
        .loading-bar {
            position: fixed;
            display: none;
            top: 0;
            left: 0;
            right: 0;
            height: 2px;
            z-index: 800;
            background: #60d778;
            -webkit-transform: translateX(100%);
            -moz-transform: translateX(100%);
            -o-transform: translateX(100%);
            transform: translateX(100%);
        }
        @-webkit-keyframes shift-rightwards {
            0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
            40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
        }
        @-moz-keyframes shift-rightwards {
            0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
            40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
        }
        @-o-keyframes shift-rightwards {
            0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
            40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
        }
        @keyframes shift-rightwards {
            0% { -webkit-transform: translateX(-100%); -moz-transform: translateX(-100%); -o-transform: translateX(-100%); transform: translateX(-100%); }
            40% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            60% { -webkit-transform: translateX(0%); -moz-transform: translateX(0%); -o-transform: translateX(0%); transform: translateX(0%); }
            100% { -webkit-transform: translateX(100%); -moz-transform: translateX(100%); -o-transform: translateX(100%); transform: translateX(100%); }
        }
    </style>
</head>
<body>
    <div>random content here</div>
    <div class="loading-bar"></div>
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
    <script>
        $(function() {
            $("body").addClass("app-loading");
            $.get("some-test-page.html", function(data) {
                $("body").removeClass("app-loading");
            });
        })
    </script>
</body>
</html>
Windows Azure Caching Failure (ES0006)
So this issue has been driving us nuts for almost 2 weeks. Azure updated its NuGet package for Caching, and anyone who recently updated it but still has projects running SDK 2.0 will most likely miss this highlighted section:
This is the error message we kept seeing repeatedly in the cloud as well as the emulator:
<ES0006>:There is a temporary failure. Please retry later. (One or more specified cache servers are unavailable, which could be caused by busy network or servers. For on-premises cache clusters, also verify the following conditions. Ensure that security permission has been granted for this client account, and check that the AppFabric Caching Service is allowed through the firewall on all cache hosts. Also the MaxBufferSize on the server must be greater than or equal to the serialized object size sent from the client.)
The fix? Simply update your Azure SDK to > 2.0.
I hope this helps anyone having this same issue!
Regex-less JSONP callback validation via LINQ magic
For JSONP calls we should always restrict and validate the callback parameter to prevent code injection and other attacks.
To do this we usually set a max size and only allow alphanumeric characters and underscores.
Most developers would turn to regex, but I prefer to do things regex-less. In C# we can validate the callback string as in the snippet below, using some LINQ magic:
if (!callback.ToCharArray().All(c => Char.IsLetter(c) || Char.IsNumber(c) || c == '_'))
    return "illegal callback, can only contain alphanumeric characters and underscores";
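The same regex-less check ports directly to JavaScript. This sketch is ASCII-only (slightly stricter than Char.IsLetter, which accepts any Unicode letter) and adds the size cap mentioned above:

```javascript
// Accept only letters, digits and underscores, up to a maximum length.
function isValidJsonpCallback(callback, maxLength) {
    maxLength = maxLength || 64; // arbitrary cap for illustration
    if (!callback || callback.length > maxLength) return false;
    return callback.split("").every(function(c) {
        return (c >= "a" && c <= "z") || (c >= "A" && c <= "Z") ||
               (c >= "0" && c <= "9") || c === "_";
    });
}
```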
How to get HttpContextBase from HttpContext.Current in the Global.asax?
In order to implement server-side Google Analytics tracking in our API for request monitoring, we decided to give this library a try:
https://github.com/maartenba/GoogleAnalyticsTracker
In order to get this example below to work inside the Global.asax file
Tracker tracker = new Tracker("UA-XXXXXX-XX", "www.example.org");
tracker.TrackPageView(HttpContext, "My API - Create");
I needed to somehow cast HttpContext.Current into the abstract class HttpContextBase.
Turns out it is as simple as:
new HttpContextWrapper(HttpContext.Current)
so the example becomes:
Tracker tracker = new Tracker("UA-XXXXXX-XX", "www.example.org");
tracker.TrackPageView(new HttpContextWrapper(HttpContext.Current), "My API - Create");
QUnit unit testing in js, window.location mocks
So recently I needed to unit test some JS code, but this code used the window.location.host variable, and in order to fully test the logic I had to mock the window.location global.
In case anyone ever has to do something similar here is a snippet that works with QUnit:
test("test initial mocks", function() {
    custom_window = {
        location: {
            host: "test.asu.edu"
        }
    };

    (function(window) {
        equal(window.location.host, "test.asu.edu", "we expect test.asu.edu to be in the window.host mock");
    })(custom_window);
});
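The same dependency-injection idea works outside QUnit too: any code that takes its "window" as a parameter can be handed a plain object in tests. A minimal sketch (the helper name is my own):

```javascript
// Code under test receives a window-like object instead of touching
// the real global, so tests can pass in a stub.
function hostIsCampus(win) {
    return /\.asu\.edu$/.test(win.location.host);
}
```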
Google Analytics QUnit testing without needing ga.js
I recently needed to do some QUnit testing on code that interacted with Google Analytics in JavaScript.
Specifically, firing some standard GA logic that reads a custom var (slot 4) using this:
_gat._getTrackerByName()._getVisitorCustomVar(4);
In order to not load ga.js in the test, I decided to mock these methods so the code could be tested in QUnit. The following code sets the custom var to the string ‘asu’ so it can be read later in the code using the normal custom var method:
var customvar4 = undefined;

function _tracker() {
    this._getVisitorCustomVar = _getVisitorCustomVar;
    function _getVisitorCustomVar(i) {
        return customvar4;
    }
}

// test the initial mocks
test("test initial mocks", function() {
    // setup ga mocks
    custom_gat = {
        _getTrackerByName: function() {
            return new _tracker();
        }
    };

    customvar4 = 'asu';

    (function(_gat) {
        equal(_gat._getTrackerByName()._getVisitorCustomVar(4), "asu", "we expect asu to be in the ga mock");
    })(custom_gat);
});
Obviously this can be expanded on by adding more of the GA methods, but it should help you get started. You will also notice I ignore the slot number; add a switch to the method if you need specific slot data.
How do I exclude specific files from TFS 2012 source control?
It is way easier than in previous versions of TFS: simply create a .tfignore file.
For example:
######################################
# Ignore .cpp files in the ProjA sub-folder and all its subfolders
ProjA\*.cpp
#
# Ignore .txt files in this folder
\*.txt
#
# Ignore .xml files in this folder and all its sub-folders
*.xml
#
# Ignore all files in the Temp sub-folder
\Temp
#
# Do not ignore .dll files in this folder nor in any of its sub-folders
!*.dll
Determine if a class is running in Azure and if it is actually in the cloud vs. the emulator
We needed to determine if a class was running in Azure and if so whether it was actually in the cloud or being emulated locally.
Turns out this is very simple to do, just add a reference to:
Microsoft.WindowsAzure.ServiceRuntime
And then use these two static booleans. Notice the try/catch blocks: 32-bit processes throw an error here, so returning false in the catch remedies that.
public static bool InAzureEnvironment
{
    get
    {
        try
        {
            return RoleEnvironment.IsAvailable;
        }
        catch
        {
            return false;
        }
    }
}

public static bool InCloud
{
    get
    {
        try
        {
            return InAzureEnvironment && !RoleEnvironment.IsEmulated;
        }
        catch
        {
            return false;
        }
    }
}