#how to calculate sum of column values in javascript
How to calculate the sum of a column in jQuery
In this article, we learn how to calculate the sum of a column in jQuery - in other words, how to recompute a total as values are entered into a column. We will use jQuery version 3.x and Bootstrap version 4 to give the form some polish. You will learn about jQuery's each() function and how to parse the column data. Below is the basic HTML code, which includes the CDN in their…
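Since the walkthrough above is cut off, here is a minimal sketch of the kind of calculation the article describes. The class name qty and the id total are my own illustrative choices, not taken from the article:

$(document).on("input", ".qty", function () {
    var sum = 0;
    // walk every input in the column; treat blanks and non-numbers as 0
    $(".qty").each(function () {
        sum += parseFloat($(this).val()) || 0;
    });
    $("#total").text(sum);
});

Delegating the handler to document means that rows added dynamically later get summed too, without rebinding.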
Years ago, I was working part-time in a government organization. My job was mostly helping with processing (printed out) forms, but because I happen to be good with IT, I also got roped into updating the templates for those forms when necessary.
Because the recommended process was filling them out on the computer and printing them out (why? government agency), the logical next feature requested was performing some validation on the data. Like "if you put a checkmark in the 'yes' box in section 1, you also have to fill out sections 2 to 4", or "if a number is entered on a new row in this column, recalculate the sum of the column in another field", that kind of thing.
This was programmed in an ancient - and even by then deprecated - piece of Adobe software, which allowed you to attach a snippet of JavaScript to each form field in a PDF and to read the values of other form fields. It stored the displayed value of the form field in a property, so you would do things like "pages[1].sections[3].checkboxes[1].value" to access it.
I eventually ran into a bug where I could not access the displayed value of other fields (I think it would return an empty string no matter what), but only if the field had been modified by JavaScript. And only sometimes.
Off to Google I go. Google says: "You are looking for a weird bug in software that was invented almost before the WWW and that Adobe made sure to kill the support forum for when the successor in that product line was released? Hah. Hah. Hahaha. Get a load of this guy."
What I eventually found out through several days worth of trial and error was this:
As a feature for the less technical users, the value actually stored in the JavaScript object was not, technically, whatever you assigned to this.value - it was the result of whichever assignment ran last in that piece of code.
Which is fine as long as all you are changing with Javascript is that particular field.
Remember the part where I said "recalculate other stuff based on the value of this cell"? Yeah. When the result of the last assignment was unrelated, it would throw away whatever was in this.value. Unless the last thing you did was somehow "touching" the this.value property.
This is how I ended up writing the most inanely necessary and philosophical statement in a piece of code I have ever written. From then on, every calculation in that PDF ended with
this.value = this.value
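To illustrate, a field script hit by the bug looked roughly like the following. The accessor style is borrowed from the post itself; this is a reconstruction, not any real, documented Adobe API, and rowCount is an illustrative name:

// recalculate the column total whenever this cell changes
var sum = 0;
for (var i = 0; i < rowCount; i++) {
    sum += pages[0].sections[2].rows[i].value;
}
pages[0].sections[2].total.value = sum;  // the LAST assignment "wins"...
this.value = this.value;                 // ...so touch this.value to keep it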
Here’s one for the programmers: what’s the most asinine bug you’ve ever encountered?
For my part, one of my top picks involved a system whereby each entry in a database had an associated file attachment. Presumably to keep the indices down to a manageable size, those attachments were distributed among 100 separate cloud storage buckets, with handles ranging from “0” to “99”.
The system would decide which bucket to store each file attachment in by taking the last two digits of the associated database entry’s ID; for example, entry ID 12345 would store its file attachment in bucket “45”. It’s at least vaguely reasonable to assume that the terminal digits of the entry ID will be evenly distributed, so nothing too objectionable going on there.
The problem: when external applications tried to reconstruct the path to the file attachment based on the entry’s ID - which should have been 100% reliable - they were getting a nonexistent filename about 10% of the time. When I was brought on board, what I was told is that the system was intermittently failing to store the file attachment, possibly due to API issues with the cloud storage.
This is not what was happening.
You likely noticed that the buckets’ handles range from “0” to “99” and not “00” to “99”, right? I.e., the first ten buckets have only one character in their handles while all the rest have two. There are any number of ways to perform that mapping, but how the genius who built this thing decided to do it was to treat each entry ID as a string, chop off everything but the last two characters, then take the resulting two-character string and replace the “0” characters with zero-length strings.
(Some of you are probably way ahead of me here.)
For example, an entry with an ID of, say, 3201 would be translated as 3201 > “3201” > “01” > “1”, which is exactly the bucket handle you’d expect.
The trouble is, an entry with an ID of, say, 3210 would also be translated as 3210 > “3210” > “10” > “1” - i.e., the attachment would be stored in bucket “1”, not the expected destination of bucket “10”.
Basically, what was happening is that the buckets with single-character handles were ending up with twice as many files as they should have, and buckets “10”, “20”, “30”, etc. were totally empty. Nobody who’d worked on the issue prior to me had noticed; whenever creating a database entry “failed”, they’d just try again, and since that resulted in the entry ID being incremented, they’d usually end up with an entry whose file attachment was in the expected bucket on the second try, creating the false impression that the problem was intermittent.
But wait - there’s more!
What about entries with IDs like, say, 3200? That translates as 3200 > “3200” > “00” > “” - i.e., when the system tried to determine the handle of the bucket to store that entry’s file attachment in, it would come up with a zero-length string. That’s gotta be a fatal error, right?
Nope! Apparently, someone - possibly the system’s original author - had spotted the issue, but rather than fixing it, they’d simply prevented the fatal error by creating a cloud storage bucket whose handle was a zero-length string. The resulting bucket didn’t show up in the list of available buckets, so nobody knew it was there, but it was somehow still a valid target for all other API calls, so the system was quite happily shoving the file attachment for every database entry whose ID ended in 00 into a bucket nobody knew existed.
And the kicker? The system didn’t use strong type checking, so this workaround wasn’t just masking the issue with entry IDs ending in 00. It was also masking every other situation where a typo in the code was causing a null or uninitialised bucket handle to be passed to the API. The null variable would be implicitly typecast to a zero-length string, and there was in fact a bucket whose handle was a zero-length string, so away we go!
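For the record, the entire failure mode fits in a few lines. A reconstruction in JavaScript (not the original code):

function bucketHandle(id) {
    // keep the last two characters, then strip EVERY "0" - the bug
    return String(id).slice(-2).replace(/0/g, "");
}

bucketHandle(3201); // "1"  - the right answer, by coincidence
bucketHandle(3210); // "1"  - wrong: should be "10"
bucketHandle(3200); // ""   - the invisible zero-length bucket

The intended mapping only needs the leading zero stripped - .replace(/^0/, "") - or, better yet, no stripping at all and handles running from "00" to "99".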
So that’s my story. What’s yours?
Pokemon evolution(ary algorithm)!
For some details on the algorithm and implementation, see below ^^
Once upon a time I saw some images generated with basic shapes that tried to approximate a given image. Because I like the style, I made something like that myself. Problem was: there aren’t many good libraries for drawing shapes that also handle transparency and per-pixel operations. Coding all of that myself was too much work, so I used the easiest solution: JavaScript. Problem was... JavaScript is slow. Drawing a few hundred triangles and then comparing the resulting image to another given image was a lot of work, even using multiple workers to distribute it. I got maybe 5-10 images per second that way. The algorithm I use is pretty simple but slow... so it took hours to get even somewhat reasonable results.
A few days ago I decided to revisit this thing with the help of my trusty GPU. Basically I redid everything so it runs completely on the GPU. I now have about 600 images per second!
Here you can see some of my most liked pokemon :)
Following are some more details on how it works:
The algorithm: Take a set of triangles, each consisting of three 2D vertices and a color with transparency. Draw them all, then compare the resulting image with the one you want to approximate. This can be done, for example, by comparing each pixel and summing the squared differences of each color channel (squaring makes everything positive, which is good, and is also a nice and smooth function). If the result is better than your current best, use the new one as your new best. Now go over your triangles and, with some probability, change their vertices or colors. Then repeat from the drawing phase. This is a simple hill-climbing algorithm. To be precise, it isn’t really an evolutionary/genetic algorithm, as there isn’t a population with things like crossover happening. But being a bit more informal here (and for the title to make sense :D) I will still call it evolutionary. Just think of it as having a single individual reproducing. Only the best survives and lives as long as it stays the best, outliving all its children. Each generation mutates slightly and has to adapt to the environment. So I would say the naming is at least a bit justified.
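In code, the loop is short. A sketch in JavaScript, where randomTriangles, mutate, and renderAndScore are my stand-ins for the steps just described:

var best = randomTriangles(N);
var bestError = renderAndScore(best);   // sum of squared pixel differences

while (true) {
    var candidate = mutate(best);       // randomly nudge vertices or colors
    var error = renderAndScore(candidate);
    if (error < bestError) {            // keep only improvements
        best = candidate;
        bestError = error;
    }
}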
The implementation: If you want to do something like this, here is the general recipe. You generally want to avoid transferring data between the CPU and the GPU, as copying stuff needlessly is not good. My implementation uses OpenGL, but you can choose any other API, as long as it provides compute shaders and writeable buffers (you could of course emulate those with textures or other techniques, but that would require some more thinking). You put all your triangle data in a buffer. I use one vec2 buffer for the positions (length = triangles*3) and one vec4 buffer for colors (length = triangles). Those buffers are bound as shader storage buffer objects (SSBOs) and can be accessed by all shader stages. One compute shader updates the triangles; each work item processes one triangle. The only important thing here is that you have to implement your own pseudo-random number generator. Probably the easiest way is to upload some initial random seeds and then use a simple linear congruential generator (https://en.wikipedia.org/wiki/Linear_congruential_generator). I use a more involved Tausworthe ("tau-step") generator, which is probably overkill, but I had it lying around (I used it in Monte Carlo simulations, where the randomness matters a bit more).
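The congruential step itself is tiny. A minimal version with the widely used Numerical Recipes constants, shown here in JavaScript (in a compute shader the same arithmetic works on a uint, which wraps modulo 2^32 automatically):

// one step per work item; seed is a 32-bit unsigned integer
seed = (1664525 * seed + 1013904223) >>> 0;  // ">>> 0" wraps to 32 bits
var u = seed / 4294967296;                   // uniform float in [0, 1)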
The drawing phase can be very simple: just bind your SSBOs as vertex buffer objects and draw in triangles mode. Or take the more complicated route (which I did for some reason, probably because of some other code I used elsewhere...): draw n = triangles instanced points, use a geometry shader to generate a triangle for each point from the buffers, then render normally. I will probably change that and hope to see some improvements. The result is rendered into a framebuffer object with an attached color texture.
The next step is just some screen quad/triangle shader comparing each pixel of the input image with the rendered one and calculating the difference. This is again put in a frame buffer object texture.
The important part is the error calculation. If you use some kind of sum of values, which is probably reasonable, you have the problem that summing is inherently rather sequential. Luckily there is a parallel sum algorithm (see, for example, https://developer.nvidia.com/gpugems/GPUGems3/gpugems3_ch39.html). If you render the drawing into a framebuffer object, you can bind the output texture as an image and use it for read/write operations. Thinking of the image as just its rows (or columns) laid out sequentially, it is easy to transform the 2D image sum into the 1D problem the algorithm expects.
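Written out sequentially, the parallel sum is just a pairwise reduction. On the GPU, each pass of the outer loop below is one dispatch, and the inner loop is what the parallel work items do:

// halve the number of partial sums on each pass
for (var stride = 1; stride < n; stride *= 2) {
    for (var i = 0; i + stride < n; i += 2 * stride) {
        data[i] += data[i + stride];
    }
}
// after ceil(log2(n)) passes, data[0] holds the total error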
A simple shader with only one dispatch item just copies the result into another SSBO, where the current maximum is also stored.
The next compute shader goes over all triangles. If the old error stored in the SSBO is larger than the current one, it just copies the contents of the current triangle buffer into the “best” buffer.
The only CPU round trips I do are downloading the error to draw the image on the screen (and, when the error has improved by enough, writing it out as a file), then uploading the buffer with the updated current best error.
And that was basically all, in short form of course. You can of course do a lot better: much to optimize, maybe use a real evolutionary algorithm, etc. But this was more or less a short two-evening project, so it’s alright for now ^^
The Mindfulness of a Manual Performance Audit
As product owners or developers, we probably have a good handle on which core assets we need to make a website work. But rarely is that the whole picture. How well do we know every last thing that loads on our sites?
An occasional web performance audit, done by hand, does make us aware of every last thing. What’s so great about that? Well, for starters, the process increases our mindfulness of what we are actually asking of our users. Furthermore, a bit of spreadsheet wizardry lets us shape our findings in a way that has more meaning for stakeholders. It allows us to speak to our web performance in terms of purpose, like so:
Want to be able to make something like that? Follow along.
Wait, don’t we have computers for this sort of thing?
A manual audit may seem like pointless drudgery. Why do this by hand? Can’t we automate this somehow?
That’s the whole point. We want to achieve mindfulness—not automate everything away. When we take the time to consider each and every thing that loads on a page, we get a truer picture of our work.
It takes a human mind to look at every asset on a page and assign it a purpose. This in turn allows us to shape our data in such a way that it means something to people who don’t know what acronyms like CSS or WOFF mean. Besides, who doesn’t like a nice pie chart?
Here’s the process, step by step:
Get your performance data in a malleable format.
Extract the information necessary.
Go item by item, assigning each asset request a purpose.
Calculate totals, and modify data into easily understood units.
Make fancy pie charts.
The audit may take half an hour to an hour the first time you do it this way, but with practice you’ll be able to do it in a few minutes. Let’s go!
Gathering your performance data
To get started, figure out what URL you want to evaluate. Look at your analytics and try to determine which page type is your most popular. Don’t just default to your home page. For instance, if you have a news site, articles are probably your most popular page type. If you’re analyzing a single-page app, determine what the most commonly accessed view is.
You need to get your network activity at that URL into a CSV/spreadsheet format. In my experience, the easiest way to do this is to use WebPagetest, whose premise is simple: give it a URL, and it will do an assessment that tries to measure perceived performance.
Head over to WebPagetest and pop your URL in the big field on the homepage. However, before running the test, open the Advanced Settings panel. Make sure you’re only running one test, and set Repeat View to First View Only. This will ensure that you don’t have duplicate requests in your data. Now, let the test run—hit the big “Start Test” button.
Once you have a results page, click the link in the top right corner that says “Raw object data”.
A CSV file will download with your network requests set out in a spreadsheet that you can manipulate.
Navigating & scrubbing the data
Now, open the CSV file in your favorite spreadsheet editor: Excel, Numbers, or (my personal favorite) Google Sheets. The rest of this article will be written with Google Sheets in mind, though a similar result is certainly possible with other spreadsheet programs.
At first it will probably seem like this file contains an unwieldy amount of information, but we’re only interested in a small amount of this data. These are the three columns we care about:
Host (column F)
URL (column G)
Object Size (column N)
The other columns you can just ignore, hide, or delete. Or even better: select those three columns, copy them, and paste them into a new spreadsheet.
Auditing each asset request
With your pared-down spreadsheet, insert a new first column and label it “Purpose”. You can also include a Description/Comment column, if you wish.
Next, go down each row, line by line, and assign each asset request a purpose. I suggest something like the following:
Content (e.g., the core HTML document, images, media—the stuff users care about)
Function (e.g., functional JavaScript files that you have authored, CSS, webfonts)
Analytics (e.g., Google Analytics, New Relic, etc.)
Ads (e.g., Google DFP, any ad networks, etc.)
Your Purpose names can be whatever you want. What matters is that your labels for each purpose are consistent—capitalization and all. They need to group neatly in order to generate the fancy charts later. (Pro tip: use data validation on this column to ensure consistency in your spreadsheet.)
So how do you determine the purpose? Typically, the biggest clue is the “Host” column. You will, very quickly, start to recognize which hosts provide what. Your root URL will be where your document comes from, but you will also find:
CDN URLs like cloudfront.net, or cloudflare.com. Sometimes these have images (which are typically content); sometimes they host CSS or JavaScript files (functionality).
Analytics URLs like googletagservices.com, googletagmanager.com, google-analytics.com, or js-agent.newrelic.com.
Ad URLs like doubleclick.net or googlesyndication.com.
If you’re ever unsure of a URL, either try it out yourself in your browser, or literally google the URL. (Hint: if you don’t recognize the URL right away, it’s most likely ad-related.)
Mindfulness
Just doing the steps above will likely be eye-opening for you. Stopping to consider each asset on a page, and why it’s there, will help you be mindful of every single thing the page loads.
You may be in for some surprises the first time you do this. A few unexpected items might turn up. A script might be loaded more than once. That social widget might be a huge page weight. Requests coming from ads might be more numerous than you thought. That’s why I suggested a Description/Comment column—you can make notes there like “WTF?” and “Can we remove this?”
Augmenting your data
Before you can generate fancy pie charts, you’ll need to do a little more spreadsheet wrangling. Forewarned is forearmed—extreme spreadsheet nerdery lies ahead.
First, you need to translate the request sizes to kilobytes (KB), because they are initially supplied in bytes, and no human speaks in terms of bytes. Next to the column “Object Size,” insert another column called “Object Size (KB).” Then enter a formula in the first cell, something like this:
=E2/1000
Translation: you’re simply dividing the amount in the cell from the previous column (E2, in this case) by 1000. You can highlight this new cell, then drag the corner down the entire column to do the same for each row.
Totaling requests
Now, to figure out how many HTTP requests are related to each Purpose, you need to do a special kind of count. Insert two more columns, one labeled “Purpose Labels” and the second “Purpose Reqs.” Under Purpose Labels, in the first row, enter this formula:
=SORT(UNIQUE(B2:B),1,TRUE)
This assumes that your purpose assessment is in column B. If it’s not, swap out the “B” in this example for your column letter. This formula walks down column B and outputs each unique value once, in sorted order. You only need to enter it in the first cell of the column. This is one reason why consistency in the Purpose column is important.
Now, under the second column you made (Purpose Reqs) in the first cell, enter this formula:
=ARRAYFORMULA(COUNTIF(B2:B,G2:G))
This formula also walks down column B and counts how many rows match each label in column G (assuming column G is your Purpose Labels column). This is the easiest way to total how many HTTP requests fall into each purpose.
Totaling download size by purpose
Finally, you can now also total the data (in KB) for each purpose. Insert one more column and call it Purpose Download Size. In the first cell, insert the following formula:
=SUM(FILTER($F$2:F,$B$2:B=G2))
This will total the data size in column F if its purpose in column B matches G2 (i.e., your first Purpose Label from the section above). In contrast to the last two formulas, you’ll need to copy this formula and modify it for each row, making the last part (“G2”) match the row it’s on. In this case, the next one would end in “G3”.
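If FILTER feels opaque, the standard SUMIF function is an equivalent alternative (my suggestion, not part of the original recipe) that produces the same per-row total:

=SUMIF($B$2:B,G2,$F$2:F)

Like the FILTER version, it has to be copied down each row so that the “G2” reference advances.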
Make with the fancy charts
With your assets grouped by purpose, data translated to KB, number of requests counted, and download size totaled, it will be pretty easy to generate some charts.
The HTTP request chart
To make an HTTP request chart, select the columns Purpose Label and Purpose Reqs (columns G and H in my example), and then go to Insert > Chart. Scroll down the list of possible charts, and choose a pie chart. Be sure to check the box marked “Use column G as labels.”
Under the “Customization” tab, edit the Title to say “HTTP Requests”; under “Slice,” be sure “Value” is selected (the default is “Percentage”). We do this because the number of requests is what you want to convey here.
Go ahead—tweak the colors to your liking. And ditch Arial while you’re at it.
Download-size chart
The download-size-by-purpose pie chart is very similar. Select the columns Purpose Label and Purpose Download Size (columns G & I in my example); then go to Insert > Chart. Scroll down the list of possible charts and choose a pie chart. Be sure to check the box marked “Use column G as labels”.
Under the “Customization” tab, edit the Title to say “Download Size”; under “Slice,” be sure “Value” is selected as well. We do this so we can indicate the total KB for each purpose.
Or, you can grab a ready-made template. If you want to see a completed assessment, check out the one I did on an A List Apart article. I’ve also made a blank file with a lot of the trickier spreadsheet stuff already done. Feel free to go to File > Make a Copy so you can play around with it. You just need to get your page data from WebPagetest and paste in the three columns. After that, you can start your line-by-line assessment.
Telling the good from the bad
If you show your data to a stakeholder, they may be surprised by how much page weight goes to things like ads or analytics. On the other hand, they might respond by asking what we should be aiming for. That question is a little harder to answer.
Some benchmarks get bandied about—1 MB or less, a WebPagetest Score of 1000, a Google PageSpeed score of over 90, and so on. But those are very arbitrary parameters and, depending on your project, unattainable ideals.
My suggestion? Do an assessment like this on your competitors. If you can come back to your stakeholders and show how two or three competitors stack up, and show them what you’re doing, that will go much further in championing performance.
Remember that performance is never “done”—it can only improve. What might help your organization is doing assessments like this over time and presenting page performance as an ongoing series of bar charts. With a little effort (and luck), you should be able to demonstrate that the things your organization cares about are continually improving. If not, it will present a much more compelling case for why things need to change for the better.
So you have some pretty charts. Now what?
Your charts’ usefulness will vary according to the precise business needs and politics of your organization.
For instance, let’s say you’re a developer, and a project manager asks you to add yet another ad-metrics script to your site. After completing an assessment like the one above, you might be able to come back and say, “Ads already constitute 40 percent of our page weight. Do you really want to pile on more?”
Because you’ve ascribed purpose to your asset requests, you’ll be able to offer data like that. I once worked with a project manager who started pushing back on such requests because I was able to give them easy-to-understand data of this sort. I’m not saying it will always turn out this way, but you need to give decision makers information they can grasp.
Remember, too, that you are in charge of the Purpose column. You can make up any purpose you want. Interested in the impact that movie files have on your site relative to everything else? Make one of your purposes “Movies.” Want to call out framework files versus files you personally author? Go for it!
I hope that this article has made you want to consider, and reconsider, each and every thing you download on a given page. Each and every request. And, in the process of doing this, I hope you are equipped to call out by purpose every item you ask your users to download. That will allow you to talk with your stakeholders in a way that they understand, and will help you make the case for better performance choices.
Further reading:
My publicly available spreadsheet template
A sample completed assessment of an A List Apart article page
Google Sheets documentation on working with charts
WebPagetest documentation
Text
The Mindfulness of a Manual Performance Audit
As product owners or developers, we probably have a good handle on which core assets we need to make a website work. But rarely is that the whole picture. How well do we know every last thing that loads on our sites?
An occasional web performance audit, done by hand, does make us aware of every last thing. What’s so great about that? Well, for starters, the process increases our mindfulness of what we are actually asking of our users. Furthermore, a bit of spreadsheet wizardry lets us shape our findings in a way that has more meaning for stakeholders. It allows us to speak to our web performance in terms of purpose, like so:
Want to be able to make something like that? Follow along.
Wait, don’t we have computers for this sort of thing?
A manual audit may seem like pointless drudgery. Why do this by hand? Can’t we automate this somehow?
That’s the whole point. We want to achieve mindfulness—not automate everything away. When we take the time to consider each and every thing that loads on a page, we get a truer picture of our work.
It takes a human mind to look at every asset on a page and assign it a purpose. This in turn allows us to shape our data in such a way that it means something to people who don’t know what acronyms like CSS or WOFF mean. Besides, who doesn’t like a nice pie chart?
Here’s the process, step by step:
Get your performance data in a malleable format.
Extract the information necessary.
Go item by item, assigning each asset request a purpose.
Calculate totals, and modify data into easily understood units.
Make fancy pie charts.
The audit may take half an hour to an hour the first time you do it this way, but with practice you’ll be able to do it in a few minutes. Let’s go!
Gathering your performance data
To get started, figure out what URL you want to evaluate. Look at your analytics and try to determine which page type is your most popular. Don’t just default to your home page. For instance, if you have a news site, articles are probably your most popular page type. If you’re analyzing a single-page app, determine what the most commonly accessed view is.
You need to get your network activity at that URL into a CSV/spreadsheet format. In my experience, the easiest way to do this is to use WebPagetest, whose premise is simple: give it a URL, and it will do an assessment that tries to measure perceived performance.
Head over to WebPagetest and pop your URL in the big field on the homepage. However, before running the test, open the Advanced Settings panel. Make sure you’re only running one test, and set Repeat View to First View Only. This will ensure that you don’t have duplicate requests in your data. Now, let the test run—hit the big “Start Test” button.
Once you have a results page, click the link in the top right corner that says “Raw object data”.
A CSV file will download with your network requests set out in a spreadsheet that you can manipulate.
Navigating & scrubbing the data
Now, open the CSV file in your favorite spreadsheet editor: Excel, Numbers, or (my personal favorite) Google Sheets. The rest of this article will be written with Google Sheets in mind, though a similar result is certainly possible with other spreadsheet programs.
At first it will probably seem like this file contains an unwieldy amount of information, but we’re only interested in a small amount of this data. These are the three columns we care about:
Host (column F)
URL (column G)
Object Size (column N)
The other columns you can just ignore, hide, or delete. Or even better: select those three columns, copy them, and paste them into a new spreadsheet.
Auditing each asset request
With your pared-down spreadsheet, insert a new first column and label it “Purpose”. You can also include a Description/Comment column, if you wish.
Next, go down each row, line by line, and assign each asset request a purpose. I suggest something like the following:
Content (e.g., the core HTML document, images, media—the stuff users care about)
Function (e.g., functional JavaScript files that you have authored, CSS, webfonts)
Analytics (e.g., Google Analytics, New Relic, etc.)
Ads (e.g., Google DFP, any ad networks, etc.)
Your Purpose names can be whatever you want. What matters is that your labels for each purpose are consistent—capitalization and all. They need to group neatly in order to generate the fancy charts later. (Pro tip: use data validation on this column to ensure consistency in your spreadsheet.)
So how do you determine the purpose? Typically, the biggest clue is the “Host” column. You will, very quickly, start to recognize which hosts provide what. Your root URL will be where your document comes from, but you will also find:
CDN URLs like cloudfront.net, or cloudflare.com. Sometimes these have images (which are typically content); sometimes they host CSS or JavaScript files (functionality).
Analytics URLs like googletagservices.com, googletagmanager.com, google-analytics.com, or js-agent.newrelic.com.
Ad URLs like doubleclick.net or googlesyndication.com.
If you’re ever unsure of a URL, either try it out yourself in your browser, or literally google the URL. (Hint: if you don’t recognize the URL right away, it’s most likely ad-related.)
Mindfulness
Just doing the steps above will likely be eye-opening for you. Stopping to consider each asset on a page, and why it’s there, will help you be mindful of every single thing the page loads.
You may be in for some surprises the first time you do this. A few unexpected items might turn up. A script might be loaded more than once. That social widget might be a huge page weight. Requests coming from ads might be more numerous than you thought. That’s why I suggested a Description/Comment column—you can make notes there like “WTF?” and “Can we remove this?”
Augmenting your data
Before you can generate fancy pie charts, you’ll need to do a little more spreadsheet wrangling. Forewarned is forearmed—extreme spreadsheet nerdery lies ahead.
First, you need to translate the request sizes to kilobytes (KB), because they are initially supplied in bytes, and no human speaks in terms of bytes. Next to the column “Object Size,” insert another column called “Object Size (KB).” Then enter a formula in the first cell, something like this:
=E2/1000
Translation: you’re simply dividing the amount in the cell from the previous column (E2, in this case) by 1000. You can highlight this new cell, then drag the corner down the entire column to do the same for each row.
Totaling requests
Now, to figure out how many HTTP requests are related to each Purpose, you need to do a special kind of count. Insert two more columns, one labeled “Purpose Labels” and the second “Purpose Reqs.” Under Purpose Labels, in the first row, enter this formula:
=SORT(UNIQUE(B2:B),1,TRUE)
This assumes that your purpose assessment is column B. If it’s not, swap out the “B” in this example for your column name. This formula will go down column B and output a result if it’s unique. You only need to enter this in the first cell of the column. This is one reason why having consistency in the Purpose column is important.
Now, under the second column you made (Purpose Reqs) in the first cell, enter this formula:
=ARRAYFORMULA(COUNTIF(B2:B,G2:G))
This formula will also go down column B, and do a count if it matches with something in column G (assuming column G is your Purpose Labels column). This is the easiest way to total how many HTTP requests fall into each purpose.
Totaling download size by purpose
Finally, you can now also total the data (in KB) for each purpose. Insert one more column and call it Purpose Download Size. In the first cell, insert the following formula:
=SUM(FILTER($F$2:F,$B$2:B=G2))
This will total the data size in column F if its purpose in column B matches G2 (i.e., your first Purpose Label from the section above). In contrast to the last two formulas, you’ll need to copy this formula and modify it for each row, making the last part (“G2”) match the row it’s on. In this case, the next one would end in “G3”.
Make with the fancy charts
With your assets grouped by purpose, data translated to KB, number of requests counted, and download size totaled, it will be pretty easy to generate some charts.
The HTTP request chart
To make an HTTP request chart, select the columns Purpose Label and Purpose Reqs (columns G and H in my example), and then go to Insert > Chart. Scroll down the list of possible charts, and choose a pie chart. Be sure to check the box marked “Use column G as labels.”
Under the “Customization” tab, edit the Title to say “HTTP Requests”; under “Slice,” be sure “Value” is selected (the default is “Percentage”). We do this because the number of requests is what you want to convey here.
Go ahead—tweak the colors to your liking. And ditch Arial while you’re at it.
Download-size chart
The download-size-by-purpose pie chart is very similar. Select the columns Purpose Label and Purpose Download Size (columns G & I in my example); then go to Insert > Chart. Scroll down the list of possible charts and choose a pie chart. Be sure to check the box marked “Use column G as labels”.
Under the “Customization” tab, edit the Title to say “Download Size”; under “Slice,” be sure “Value” is selected as well. We do this so we can indicate the total KB for each purpose.
Or, you can grab a ready-made template. If you want to see a completed assessment, check out the one I did on an A List Apart article. I’ve also made a blank file with a lot of the trickier spreadsheet stuff already done. Feel free to go to File > Make a Copy so you can play around with it. You just need to get your page data from WebPagetest and paste in the three columns. After that, you can start your line-by-line assessment.
Telling the good from the bad
If you show your data to a stakeholder, they may be surprised by how much page weight goes to things like ads or analytics. On the other hand, they might respond by asking what we should be aiming for. That question is a little harder to answer.
Some benchmarks get bandied about—1 MB or less, a WebPagetest Score of 1000, a Google PageSpeed score of over 90, and so on. But those are very arbitrary parameters and, depending on your project, unattainable ideals.
My suggestion? Do an assessment like this on your competitors. If you can come back to your stakeholders and show how two or three competitors stack up, and show them what you’re doing, that will go much further in championing performance.
Remember that performance is never “done”—it can only improve. What might help your organization is doing assessments like this over time and presenting page performance as an ongoing series of bar charts. With a little effort (and luck), you should be able to demonstrate that the things your organization cares about are continually improving. If not, it will present a much more compelling case for why things need to change for the better.
So you have some pretty charts. Now what?
Your charts’ usefulness will vary according to the precise business needs and politics of your organization.
For instance, let’s say you’re a developer, and a project manager asks you to add yet another ad-metrics script to your site. After completing an assessment like the one above, you might be able to come back and say, “Ads already constitute 40 percent of our page weight. Do you really want to pile on more?”
Because you’ve ascribed purpose to your asset requests, you’ll be able to offer data like that. I once worked with a project manager who started pushing back on such requests because I was able to give them easy-to-understand data of this sort. I’m not saying it will always turn out this way, but you need to give decision makers information they can grasp.
Remember, too, that you are in charge of the Purpose column. You can make up any purpose you want. Interested in the impact that movie files have on your site relative to everything else? Make one of your purposes “Movies.” Want to call out framework files versus files you personally author? Go for it!
I hope that this article has made you want to consider, and reconsider, each and every thing you download on a given page. Each and every request. And, in the process of doing this, I hope you are equipped to call out by purpose every item you ask your users to download. That will allow you to talk with your stakeholders in a way that they understand, and will help you make the case for better performance choices.
Further reading:
My publicly available spreadsheet template
A sample completed assessment of an A List Apart article page
Google Sheets documentation on working with charts
WebPagetest documentation
http://ift.tt/2qxHUYr
0 notes
Text
The Mindfulness of a Manual Performance Audit
As product owners or developers, we probably have a good handle on which core assets we need to make a website work. But rarely is that the whole picture. How well do we know every last thing that loads on our sites?
An occasional web performance audit, done by hand, does make us aware of every last thing. What’s so great about that? Well, for starters, the process increases our mindfulness of what we are actually asking of our users. Furthermore, a bit of spreadsheet wizardry lets us shape our findings in a way that has more meaning for stakeholders. It allows us to speak to our web performance in terms of purpose, like so:
Want to be able to make something like that? Follow along.
Wait, don’t we have computers for this sort of thing?
A manual audit may seem like pointless drudgery. Why do this by hand? Can’t we automate this somehow?
That’s the whole point. We want to achieve mindfulness—not automate everything away. When we take the time to consider each and every thing that loads on a page, we get a truer picture of our work.
It takes a human mind to look at every asset on a page and assign it a purpose. This in turn allows us to shape our data in such a way that it means something to people who don’t know what acronyms like CSS or WOFF mean. Besides, who doesn’t like a nice pie chart?
Here’s the process, step by step:
Get your performance data in a malleable format.
Extract the information necessary.
Go item by item, assigning each asset request a purpose.
Calculate totals, and modify data into easily understood units.
Make fancy pie charts.
The audit may take half an hour to an hour the first time you do it this way, but with practice you’ll be able to do it in a few minutes. Let’s go!
Gathering your performance data
To get started, figure out what URL you want to evaluate. Look at your analytics and try to determine which page type is your most popular. Don’t just default to your home page. For instance, if you have a news site, articles are probably your most popular page type. If you’re analyzing a single-page app, determine what the most commonly accessed view is.
You need to get your network activity at that URL into a CSV/spreadsheet format. In my experience, the easiest way to do this is to use WebPagetest, whose premise is simple: give it a URL, and it will do an assessment that tries to measure perceived performance.
Head over to WebPagetest and pop your URL in the big field on the homepage. However, before running the test, open the Advanced Settings panel. Make sure you’re only running one test, and set Repeat View to First View Only. This will ensure that you don’t have duplicate requests in your data. Now, let the test run—hit the big “Start Test” button.
Once you have a results page, click the link in the top right corner that says “Raw object data”.
A CSV file will download with your network requests set out in a spreadsheet that you can manipulate.
Navigating & scrubbing the data
Now, open the CSV file in your favorite spreadsheet editor: Excel, Numbers, or (my personal favorite) Google Sheets. The rest of this article will be written with Google Sheets in mind, though a similar result is certainly possible with other spreadsheet programs.
At first it will probably seem like this file contains an unwieldy amount of information, but we’re only interested in a small amount of this data. These are the three columns we care about:
Host (column F)
URL (column G)
Object Size (column N)
The other columns you can just ignore, hide, or delete. Or even better: select those three columns, copy them, and paste them into a new spreadsheet.
Auditing each asset request
With your pared-down spreadsheet, insert a new first column and label it “Purpose”. You can also include a Description/Comment column, if you wish.
Next, go down each row, line by line, and assign each asset request a purpose. I suggest something like the following:
Content (e.g., the core HTML document, images, media—the stuff users care about)
Function (e.g., functional JavaScript files that you have authored, CSS, webfonts)
Analytics (e.g., Google Analytics, New Relic, etc.)
Ads (e.g., Google DFP, any ad networks, etc.)
Your Purpose names can be whatever you want. What matters is that your labels for each purpose are consistent—capitalization and all. They need to group neatly in order to generate the fancy charts later. (Pro tip: use data validation on this column to ensure consistency in your spreadsheet.)
So how do you determine the purpose? Typically, the biggest clue is the “Host” column. You will, very quickly, start to recognize which hosts provide what. Your root URL will be where your document comes from, but you will also find:
CDN URLs like cloudfront.net, or cloudflare.com. Sometimes these have images (which are typically content); sometimes they host CSS or JavaScript files (functionality).
Analytics URLs like googletagservices.com, googletagmanager.com, google-analytics.com, or js-agent.newrelic.com.
Ad URLs like doubleclick.net or googlesyndication.com.
If you’re ever unsure of a URL, either try it out yourself in your browser, or literally google the URL. (Hint: if you don’t recognize the URL right away, it’s most likely ad-related.)
Mindfulness
Just doing the steps above will likely be eye-opening for you. Stopping to consider each asset on a page, and why it’s there, will help you be mindful of every single thing the page loads.
You may be in for some surprises the first time you do this. A few unexpected items might turn up. A script might be loaded more than once. That social widget might be a huge page weight. Requests coming from ads might be more numerous than you thought. That’s why I suggested a Description/Comment column—you can make notes there like “WTF?” and “Can we remove this?”
Augmenting your data
Before you can generate fancy pie charts, you’ll need to do a little more spreadsheet wrangling. Forewarned is forearmed—extreme spreadsheet nerdery lies ahead.
First, you need to translate the request sizes to kilobytes (KB), because they are initially supplied in bytes, and no human speaks in terms of bytes. Next to the column “Object Size,” insert another column called “Object Size (KB).” Then enter a formula in the first cell, something like this:
=E2/1000
Translation: you’re simply dividing the amount in the cell from the previous column (E2, in this case) by 1000. You can highlight this new cell, then drag the corner down the entire column to do the same for each row.
Totaling requests
Now, to figure out how many HTTP requests are related to each Purpose, you need to do a special kind of count. Insert two more columns, one labeled “Purpose Labels” and the second “Purpose Reqs.” Under Purpose Labels, in the first row, enter this formula:
=SORT(UNIQUE(B2:B),1,TRUE)
This assumes that your purpose assessment is column B. If it’s not, swap out the “B” in this example for your column name. This formula will go down column B and output a result if it’s unique. You only need to enter this in the first cell of the column. This is one reason why having consistency in the Purpose column is important.
Now, under the second column you made (Purpose Reqs) in the first cell, enter this formula:
=ARRAYFORMULA(COUNTIF(B2:B,G2:G))
This formula will also go down column B, and do a count if it matches with something in column G (assuming column G is your Purpose Labels column). This is the easiest way to total how many HTTP requests fall into each purpose.
Totaling download size by purpose
Finally, you can now also total the data (in KB) for each purpose. Insert one more column and call it Purpose Download Size. In the first cell, insert the following formula:
=SUM(FILTER($F$2:F,$B$2:B=G2))
This will total the data size in column F if its purpose in column B matches G2 (i.e., your first Purpose Label from the section above). In contrast to the last two formulas, you’ll need to copy this formula and modify it for each row, making the last part (“G2”) match the row it’s on. In this case, the next one would end in “G3”.
Make with the fancy charts
With your assets grouped by purpose, data translated to KB, number of requests counted, and download size totaled, it will be pretty easy to generate some charts.
The HTTP request chart
To make an HTTP request chart, select the columns Purpose Label and Purpose Reqs (columns G and H in my example), and then go to Insert > Chart. Scroll down the list of possible charts, and choose a pie chart. Be sure to check the box marked “Use column G as labels.”
Under the “Customization” tab, edit the Title to say “HTTP Requests”; under “Slice,” be sure “Value” is selected (the default is “Percentage”). We do this because the number of requests is what you want to convey here.
Go ahead—tweak the colors to your liking. And ditch Arial while you’re at it.
Download-size chart
The download-size-by-purpose pie chart is very similar. Select the columns Purpose Label and Purpose Download Size (columns G & I in my example); then go to Insert > Chart. Scroll down the list of possible charts and choose a pie chart. Be sure to check the box marked “Use column G as labels”.
Under the “Customization” tab, edit the Title to say “Download Size”; under “Slice,” be sure “Value” is selected as well. We do this so we can indicate the total KB for each purpose.
Or, you can grab a ready-made template. If you want to see a completed assessment, check out the one I did on an A List Apart article. I’ve also made a blank file with a lot of the trickier spreadsheet stuff already done. Feel free to go to File > Make a Copy so you can play around with it. You just need to get your page data from WebPagetest and paste in the three columns. After that, you can start your line-by-line assessment.
Telling the good from the bad
If you show your data to a stakeholder, they may be surprised by how much page weight goes to things like ads or analytics. On the other hand, they might respond by asking what we should be aiming for. That question is a little harder to answer.
Some benchmarks get bandied about—1 MB or less, a WebPagetest Score of 1000, a Google PageSpeed score of over 90, and so on. But those are very arbitrary parameters and, depending on your project, unattainable ideals.
My suggestion? Do an assessment like this on your competitors. If you can come back to your stakeholders and show how two or three competitors stack up, and show them what you’re doing, that will go much further in championing performance.
Remember that performance is never “done”—it can only improve. What might help your organization is doing assessments like this over time and presenting page performance as an ongoing series of bar charts. With a little effort (and luck), you should be able to demonstrate that the things your organization cares about are continually improving. If not, it will present a much more compelling case for why things need to change for the better.
So you have some pretty charts. Now what?
Your charts’ usefulness will vary according to the precise business needs and politics of your organization.
For instance, let’s say you’re a developer, and a project manager asks you to add yet another ad-metrics script to your site. After completing an assessment like the one above, you might be able to come back and say, “Ads already constitute 40 percent of our page weight. Do you really want to pile on more?”
Because you’ve ascribed purpose to your asset requests, you’ll be able to offer data like that. I once worked with a project manager who started pushing back on such requests because I was able to give them easy-to-understand data of this sort. I’m not saying it will always turn out this way, but you need to give decision makers information they can grasp.
Remember, too, that you are in charge of the Purpose column. You can make up any purpose you want. Interested in the impact that movie files have on your site relative to everything else? Make one of your purposes “Movies.” Want to call out framework files versus files you personally author? Go for it!
I hope that this article has made you want to consider, and reconsider, each and every thing you download on a given page. Each and every request. And, in the process of doing this, I hope you are equipped to call out by purpose every item you ask your users to download. That will allow you to talk with your stakeholders in a way that they understand, and will help you make the case for better performance choices.
Further reading:
My publicly available spreadsheet template
A sample completed assessment of an A List Apart article page
Google Sheets documentation on working with charts
WebPagetest documentation
http://ift.tt/2qxHUYr
0 notes
Text
The Mindfulness of a Manual Performance Audit
As product owners or developers, we probably have a good handle on which core assets we need to make a website work. But rarely is that the whole picture. How well do we know every last thing that loads on our sites?
An occasional web performance audit, done by hand, does make us aware of every last thing. What’s so great about that? Well, for starters, the process increases our mindfulness of what we are actually asking of our users. Furthermore, a bit of spreadsheet wizardry lets us shape our findings in a way that has more meaning for stakeholders. It allows us to speak to our web performance in terms of purpose, like so:
Want to be able to make something like that? Follow along.
Wait, don’t we have computers for this sort of thing?
A manual audit may seem like pointless drudgery. Why do this by hand? Can’t we automate this somehow?
That’s the whole point. We want to achieve mindfulness—not automate everything away. When we take the time to consider each and every thing that loads on a page, we get a truer picture of our work.
It takes a human mind to look at every asset on a page and assign it a purpose. This in turn allows us to shape our data in such a way that it means something to people who don’t know what acronyms like CSS or WOFF mean. Besides, who doesn’t like a nice pie chart?
Here’s the process, step by step:
Get your performance data in a malleable format.
Extract the information necessary.
Go item by item, assigning each asset request a purpose.
Calculate totals, and modify data into easily understood units.
Make fancy pie charts.
The audit may take half an hour to an hour the first time you do it this way, but with practice you’ll be able to do it in a few minutes. Let’s go!
Gathering your performance data
To get started, figure out what URL you want to evaluate. Look at your analytics and try to determine which page type is your most popular. Don’t just default to your home page. For instance, if you have a news site, articles are probably your most popular page type. If you’re analyzing a single-page app, determine what the most commonly accessed view is.
You need to get your network activity at that URL into a CSV/spreadsheet format. In my experience, the easiest way to do this is to use WebPagetest, whose premise is simple: give it a URL, and it will do an assessment that tries to measure perceived performance.
Head over to WebPagetest and pop your URL in the big field on the homepage. However, before running the test, open the Advanced Settings panel. Make sure you’re only running one test, and set Repeat View to First View Only. This will ensure that you don’t have duplicate requests in your data. Now, let the test run—hit the big “Start Test” button.
Once you have a results page, click the link in the top right corner that says “Raw object data”.
A CSV file will download with your network requests set out in a spreadsheet that you can manipulate.
Navigating & scrubbing the data
Now, open the CSV file in your favorite spreadsheet editor: Excel, Numbers, or (my personal favorite) Google Sheets. The rest of this article will be written with Google Sheets in mind, though a similar result is certainly possible with other spreadsheet programs.
At first it will probably seem like this file contains an unwieldy amount of information, but we’re only interested in a small amount of this data. These are the three columns we care about:
Host (column F)
URL (column G)
Object Size (column N)
The other columns you can just ignore, hide, or delete. Or even better: select those three columns, copy them, and paste them into a new spreadsheet.
Auditing each asset request
With your pared-down spreadsheet, insert a new first column and label it “Purpose”. You can also include a Description/Comment column, if you wish.
Next, go down each row, line by line, and assign each asset request a purpose. I suggest something like the following:
Content (e.g., the core HTML document, images, media—the stuff users care about)
Function (e.g., functional JavaScript files that you have authored, CSS, webfonts)
Analytics (e.g., Google Analytics, New Relic, etc.)
Ads (e.g., Google DFP, any ad networks, etc.)
Your Purpose names can be whatever you want. What matters is that your labels for each purpose are consistent—capitalization and all. They need to group neatly in order to generate the fancy charts later. (Pro tip: use data validation on this column to ensure consistency in your spreadsheet.)
So how do you determine the purpose? Typically, the biggest clue is the “Host” column. You will, very quickly, start to recognize which hosts provide what. Your root URL will be where your document comes from, but you will also find:
CDN URLs like cloudfront.net, or cloudflare.com. Sometimes these have images (which are typically content); sometimes they host CSS or JavaScript files (functionality).
Analytics URLs like googletagservices.com, googletagmanager.com, google-analytics.com, or js-agent.newrelic.com.
Ad URLs like doubleclick.net or googlesyndication.com.
If you’re ever unsure of a URL, either try it out yourself in your browser, or literally google the URL. (Hint: if you don’t recognize the URL right away, it’s most likely ad-related.)
Mindfulness
Just doing the steps above will likely be eye-opening for you. Stopping to consider each asset on a page, and why it’s there, will help you be mindful of every single thing the page loads.
You may be in for some surprises the first time you do this. A few unexpected items might turn up. A script might be loaded more than once. That social widget might be a huge page weight. Requests coming from ads might be more numerous than you thought. That’s why I suggested a Description/Comment column—you can make notes there like “WTF?” and “Can we remove this?”
Augmenting your data
Before you can generate fancy pie charts, you’ll need to do a little more spreadsheet wrangling. Forewarned is forearmed—extreme spreadsheet nerdery lies ahead.
First, you need to translate the request sizes to kilobytes (KB), because they are initially supplied in bytes, and no human speaks in terms of bytes. Next to the column “Object Size,” insert another column called “Object Size (KB).” Then enter a formula in the first cell, something like this:
=E2/1000
Translation: you’re simply dividing the amount in the cell from the previous column (E2, in this case) by 1000. You can highlight this new cell, then drag the corner down the entire column to do the same for each row.
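One optional variant (my tweak, not part of the original walkthrough): if the long decimals bother you, round as you convert.

=ROUND(E2/1000,1)

ROUND's second argument is the number of decimal places to keep, so this shows sizes to a tenth of a KB.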
Totaling requests
Now, to figure out how many HTTP requests are related to each Purpose, you need to do a special kind of count. Insert two more columns, one labeled “Purpose Labels” and the second “Purpose Reqs.” Under Purpose Labels, in the first row, enter this formula:
=SORT(UNIQUE(B2:B),1,TRUE)
This assumes that your purpose assessment is column B. If it’s not, swap out the “B” in this example for your column’s letter. This formula will go down column B and output each unique value it finds, sorted alphabetically. You only need to enter this in the first cell of the column. This is one reason why having consistency in the Purpose column is important.
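To make that concrete, here’s a tiny hypothetical example (the values are mine, not from a real audit): if B2:B6 held Content, Ads, Content, Analytics, Ads, the formula would spill out three sorted rows:

Ads
Analytics
Content

Each label appears once, which is exactly what you need to total against later.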
Now, under the second column you made (Purpose Reqs) in the first cell, enter this formula:
=ARRAYFORMULA(COUNTIF(B2:B,G2:G))
This formula will also go down column B and count how many rows match each label in column G (assuming column G is your Purpose Labels column). This is the easiest way to total how many HTTP requests fall into each purpose.
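If the ARRAYFORMULA version feels opaque, a plain COUNTIF dragged down the column does the same job one row at a time. This is just an alternative sketch, again assuming your purposes live in column B and your labels in column G:

=COUNTIF($B$2:$B,G2)

The dollar signs pin the counted range in place, while the G2 reference shifts to G3, G4, and so on as you drag the formula down.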
Totaling download size by purpose
Finally, you can now also total the data (in KB) for each purpose. Insert one more column and call it Purpose Download Size. In the first cell, insert the following formula:
=SUM(FILTER($F$2:F,$B$2:B=G2))
This will total the data size in column F if its purpose in column B matches G2 (i.e., your first Purpose Label from the section above). In contrast to the last two formulas, you’ll need to copy this formula and modify it for each row, making the last part (“G2”) match the row it’s on. In this case, the next one would end in “G3”.
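As an aside, here is a variant I would suggest (not part of the original walkthrough): SUMIF avoids the copy-and-modify step entirely, because the G2 reference updates itself to G3, G4, and so on when you drag the formula down.

=SUMIF($B$2:$B,G2,$F$2:$F)

SUMIF takes the range to test, the criterion, and the range to total, so this reads as “sum column F wherever column B matches the label in G2.”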
Make with the fancy charts
With your assets grouped by purpose, data translated to KB, number of requests counted, and download size totaled, it will be pretty easy to generate some charts.
The HTTP request chart
To make an HTTP request chart, select the columns Purpose Label and Purpose Reqs (columns G and H in my example), and then go to Insert > Chart. Scroll down the list of possible charts, and choose a pie chart. Be sure to check the box marked “Use column G as labels.”
Under the “Customization” tab, edit the Title to say “HTTP Requests”; under “Slice,” be sure “Value” is selected (the default is “Percentage”). We do this because the number of requests is what you want to convey here.
Go ahead—tweak the colors to your liking. And ditch Arial while you’re at it.
Download-size chart
The download-size-by-purpose pie chart is very similar. Select the columns Purpose Label and Purpose Download Size (columns G & I in my example); then go to Insert > Chart. Scroll down the list of possible charts and choose a pie chart. Be sure to check the box marked “Use column G as labels”.
Under the “Customization” tab, edit the Title to say “Download Size”; under “Slice,” be sure “Value” is selected as well. We do this so we can indicate the total KB for each purpose.
Or, you can grab a ready-made template. If you want to see a completed assessment, check out the one I did on an A List Apart article. I’ve also made a blank file with a lot of the trickier spreadsheet stuff already done. Feel free to go to File > Make a Copy so you can play around with it. You just need to get your page data from WebPagetest and paste in the three columns. After that, you can start your line-by-line assessment.
Telling the good from the bad
If you show your data to a stakeholder, they may be surprised by how much page weight goes to things like ads or analytics. On the other hand, they might respond by asking what we should be aiming for. That question is a little harder to answer.
Some benchmarks get bandied about—1 MB or less, a WebPagetest Score of 1000, a Google PageSpeed score of over 90, and so on. But those are very arbitrary parameters and, depending on your project, unattainable ideals.
My suggestion? Do an assessment like this on your competitors. If you can come back to your stakeholders and show how two or three competitors stack up, and show them what you’re doing, that will go much further in championing performance.
Remember that performance is never “done”—it can only improve. What might help your organization is doing assessments like this over time and presenting page performance as an ongoing series of bar charts. With a little effort (and luck), you should be able to demonstrate that the things your organization cares about are continually improving. If not, it will present a much more compelling case for why things need to change for the better.
So you have some pretty charts. Now what?
Your charts’ usefulness will vary according to the precise business needs and politics of your organization.
For instance, let’s say you’re a developer, and a project manager asks you to add yet another ad-metrics script to your site. After completing an assessment like the one above, you might be able to come back and say, “Ads already constitute 40 percent of our page weight. Do you really want to pile on more?”
Because you’ve ascribed purpose to your asset requests, you’ll be able to offer data like that. I once worked with a project manager who started pushing back on such requests because I was able to give them easy-to-understand data of this sort. I’m not saying it will always turn out this way, but you need to give decision makers information they can grasp.
Remember, too, that you are in charge of the Purpose column. You can make up any purpose you want. Interested in the impact that movie files have on your site relative to everything else? Make one of your purposes “Movies.” Want to call out framework files versus files you personally author? Go for it!
I hope that this article has made you want to consider, and reconsider, each and every thing you download on a given page. Each and every request. And, in the process of doing this, I hope you are equipped to call out by purpose every item you ask your users to download. That will allow you to talk with your stakeholders in a way that they understand, and will help you make the case for better performance choices.
Further reading:
My publicly available spreadsheet template
A sample completed assessment of an A List Apart article page
Google Sheets documentation on working with charts
WebPagetest documentation
http://ift.tt/2qxHUYr
0 notes
Text
A Student’s Prelude to Management and Computer Science
New Post has been published on https://netmaddy.com/a-students-prelude-to-management-and-computer-science/
A Student’s Prelude to Management and Computer Science
A friend of mine said, “I’ve visited your websites, viewed your articles, and taken a look at your background. It’s apparent to me you do not have a background in Information Technology (Computer Science) or Business Intelligence as a field of study acknowledging Computer Science.”
I wondered: does one have to have a background in IT or BI to qualify as a professional in the industry, or does a simple interest suffice?
AutoCAD, C Language, Visual Basic, PowerPoint, JavaScript, Excel, Access, COBOL, Word (Microsoft Suite), Data Entry/Processing, DOS, Fortran, Lotus Notes, Management Information Systems, HTML, and Management cover a broad spectrum at the beginning of, or in preparation for, the world of IT/BI. The value of specializing in one or more of these areas of study is the mastery, and/or understanding, of them all.
About a week ago, I was contacted, recruited, and enrolled in a four-year college program. The recruiter was adept at what she was doing, and I went along with the schedule as presented. I continually asked, “How am I going to pay for these college courses?” When the prepared documentation was submitted for the finalization process, it all came down to dollars and cents. The financial aid person finally made contact with me, assuming that I was a prime candidate for the ‘Stafford Loan!’ This is a general education loan designed to let students enroll in a given college. I explained to the administrator that I am without money, have no wish to take out a loan (for anything), am unemployed, and, not to mention, my age is also a factor. Had I been able to complete the required studies, I would have been done in less than a year; I have three years of accredited college courses with an Associate Degree. I estimated that I would be paying off a student loan for more than a few years, if I landed a job at all. I will soon be sixty years of age. Facing an educational loan that would probably still be on my plate well after I turned sixty-five, I asked the recruiter and the financial aid worker, “Do you really believe that I would take out a loan for an extravagant amount of money in the hopes that I would get hired…at my age?” Even if I were to start up a new business, a loan would be a bit risky…improbable. It would have been a good thing if I had been able to get back into the classroom.
Whatever happened to the H-1B Program, the NAFTA Treaty, or the “Financial Hardship” clause with the “Obama Letter,” and/or ex-workers who are forced into the utilization/awarding of UC Benefits?
I was promptly dropped from the student rolls by the four-year college.
Meanwhile, I am offering data that may help anyone who is interested in the pursuit of computer science knowledge. Should you be attempting to enroll in computer science and management courses, this information will give you a leg up on what is entailed in this multi-faceted field of study. Many schools teach these courses as prerequisites to advanced courses in a four-year college. Even if you are not enrolled or intending to enroll in school, but are only interested in how it all works, the limited amount of information detailed in this message will probably aid you in your preparation and/or research.
The outlined courses are:
COBOL (Common Business Oriented Language):
COBOL was first released in the 1960s as a joint venture of industry, universities, and the United States Government. COBOL’s purpose was to provide a high-level computer programming language for the business world. COBOL directly addresses the basic needs of information processing while being easy to use as well. (Take a look at SQL).
COBOL, BASIC, C, Java, and Pascal are examples of high-level computer languages. A low-level language is a programming language that requires knowledge of a computer’s internal components, and that knowledge is largely non-transferable.
AutoCAD (Computer-Aided Design):
2D (two-dimensional) drafting tasks allow you to get acquainted with computer-aided design. AutoCAD is designed to assist you in the creation of landscape plans, including setting up layers, adding text and dimensions, and making modifications. You can create electrical diagrams using symbols and attributes.
You are taught how to extract the attributes into an Excel spreadsheet. Boolean operations and modeling let you construct and analyze complex 2D shapes and images; isometric drafting, a method for simulating 3D drawings, and LT drawings are covered as well. One also learns to embed DWF (Drawing Web Format) files in web pages. An overview of AutoCAD and progressive projects teach you how to create drawing projects, landscape plans, and/or electrical schematics. The Internet-related topics include direct access to particular websites, opening and saving drawings on the web, and embedding DWF files in a web page.
C Language:
The available texts on C Language enable the student to learn both a rational approach to program development and an introduction to ANSI C. Because the first goal is primary, the emphasis falls on a disciplined approach to solving problems and on applying widely accepted software engineering methods to design program solutions as cohesive, readable, and reusable modules. ANSI C (American National Standards Institute C) is a standardized, industrial-strength programming language known for its power and portability. C Language helps the student consolidate their understanding of pointers as arrays, output parameters, and file accesses, just prior to exploring the role of the pointer in dynamic memory allocation.
C Language is widely perceived as a language to be tackled only after one has learned the fundamentals of programming in some other, friendlier language. Designed as a vehicle for programming the UNIX operating system, C found its original clientele among programmers who understood the complexities of the operating system and the underlying machine concepts that are not in the syllabus of a standard introductory programming course. C Language is for computer science majors and/or students of a wide range of other IT/BI disciplines.
Visual Basic:
The easiest and fastest way to write 32-bit Windows-based programs is the Microsoft Visual Basic Programming System. One can learn to work with ActiveX controls, compiler options, and new development tools, and master programming fundamentals, including variables, decision structures, loops, and functions. Creating custom dialog boxes, clocks, menus, and animation effects, along with managing text files, encryption, and sorting algorithms, is learned through the use of Visual Basic programming. VB also adds dimension and automation by integrating Microsoft Excel, Microsoft Word, Microsoft Outlook, and other features into an application. Other examples of the integrational power of Visual Basic include the ability to explore ActiveX controls that process RTF (Rich Text Format), run videos, display progress information, and play audio compact discs (CDs). With skills learned from Visual Basic, you can also call the memory management functions in the Windows API (Application Program Interface), download FTP (File Transfer Protocol) and HTML (Hypertext Markup Language) files from the Internet, design DHTML (Dynamic Hypertext Markup Language) pages, and exploit ActiveX Data Objects (ADO).
PowerPoint:
PowerPoint is a computer presentation graphics package. It gives you everything you need to produce a professional-looking presentation: word processing, outlining, drawing, graphing, and presentation management tools. A formal presentation to a large audience using 35mm slides, a more intimate presentation in a small conference room using overhead monitors, and/or an email presentation – PowerPoint has it all! The user is empowered with an outline to help organize his/her thoughts, an on-screen slide show with special effects such as animated bullet points, speaker’s notes, and audience handouts. Users of PowerPoint create color schemes, masters, and templates…there are ways to create the look you want for your presentation.
JavaScript:
According to some JavaScript authors, it is supposedly easy: you start with a simple script that makes cool things happen on your web page, then add more complicated stuff as you need it.
Because the web is a dynamic medium, page designers want their pages to interact with the user. It soon became obvious that HTML was insufficient to handle the demand. JavaScript was invented by Netscape to control the web browser, and add pizzazz and interactivity to your web pages.
Excel:
Objectives: to teach the fundamentals of Microsoft Excel, to expose students to examples of the computer as a useful tool, and to develop an exercise-oriented approach that allows students to learn by example and encourages independent study. Students are introduced to Excel terminology, the Excel window, and the basic characteristics of a worksheet and workbook. The applications include entering text and numbers, selecting a range, using the AutoSum button, copying using the fill handle, changing font size, bolding, centering across columns and rows, using the AutoFormat command, charting using the Chart Wizard, and using the AutoCalculate area throughout the grid of columns and rows of the Excel spreadsheet. For any form of accounting, be it business, personal, or otherwise, Excel is a must-study program for recording, charting, and analytics.
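For illustration only (a minimal sketch, with an arbitrary range of my choosing): the AutoSum button mentioned above simply inserts an ordinary SUM formula over the range Excel guesses you want totaled, something like:

=SUM(B2:B13)

You can adjust the guessed range before pressing Enter, and the same formula works identically in Google Sheets.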
Access:
Microsoft Access includes two tools that provide assistance in refining the design of an Access database. The GUI (Graphical User Interface) development environment of Microsoft Access, with menu commands, toolbars, buttons, tooltips, examples, and help screens, makes development easier. Sound, quality relational database design and development requires considerable knowledge and expertise, no matter what the platform. Access, a Relational Database Management System, has the ability to manage all of your data from a single database file. A must-study course for any and all Database Administration, Business Administration, Secretarial Administration, and Computer Science students.
Word (CMOU – Certified Microsoft Office User):
Creating and Editing word documents; Wizards and Templates to create a Cover letter and Resume; creating a Research Paper with a Table; creating Web Pages; creating a document with a Title Page and Tables; generating Form Letters, Mailing Labels, and Envelopes; creating a Professional Newsletter; and using WordArt to add Special Text Effects to a Word document.
DOS (Disk Operating System):
Before Windows, there was DOS. With just a few mouse clicks, any Windows PC can revert to the original “Disk Operating System.” Under DOS, all program files are named with either a COM, an EXE, or a BAT ending (called a filename extension). The DIR (Directory) command is used to find files by name as well as to locate files in other subdirectories on a disk. The output of the DIR command shows a list of files on a disk. The list has five columns: the file’s name, the file’s extension (part of the name), the file’s size (in bytes or characters), the date the file was created or last modified, and the time of the last modification.
Lotus Notes:
Lotus Notes is a document-centric database management system: a cross-platform, secure, distributed, document-oriented database, messaging framework, and rapid application development environment that includes pre-built applications. Lotus Notes is an integrated desktop client option for accessing business email and groupware. Lotus Notes operates as the client side of a client–server application.
Fortran (A Scientific Language):
Formula Translation – designed to allow easy translation of math formulas into code in a high-level language. Fortran was designed in the 1950s and used the first compiler (a program that translates source code into object code) ever developed. It was meant to be a programming language suitable for a wide variety of applications while being easy to learn.
Fortran permits even highly complex mathematical functions to be expressed in a form that closely resembles ordinary algebraic notation.
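To illustrate the idea of formula translation (shown here in JavaScript rather than Fortran, to stay with this article's running language), the quadratic-root formula x = (-b + sqrt(b^2 - 4ac)) / 2a maps almost symbol-for-symbol into code:

```javascript
// Formula translation: the quadratic-root formula written much as in algebra.
function quadraticRoot(a, b, c) {
  return (-b + Math.sqrt(b * b - 4 * a * c)) / (2 * a);
}

console.log(quadraticRoot(1, -3, 2)); // 2, one root of x^2 - 3x + 2 = 0
```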
RDBMS (Relational Database Management Systems):
RDBMS was designed for the business organization. It requires extremely careful planning, setup, and maintenance. A database is a collection of information that’s related to a particular subject or purpose, such as tracking customer orders or maintaining a music collection. If your database isn’t stored on a computer, or only parts of it are, you may be tracking information from a variety of sources that you have to coordinate or organize yourself. Access can manage all of your information from a single database file. Within the file, you can divide your data into separate storage containers called tables; view, add, and update data by using forms; find and retrieve just the data you want by using queries; and analyze or print data in a specific layout by using reports. RDBMS systems allow users to view, update, or analyze the database’s data from multiple locations. When the data is updated, it is automatically updated everywhere it appears.
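As a toy illustration of the table-and-query idea (plain JavaScript objects standing in for database tables; all names here are invented):

```javascript
// Toy illustration: a "table" of customer orders and a "query" over it.
var orders = [
  { customer: 'Acme',   item: 'Widget', qty: 3 },
  { customer: 'Acme',   item: 'Gadget', qty: 1 },
  { customer: 'Zenith', item: 'Widget', qty: 5 }
];

// "Find and retrieve just the data you want": all orders for one customer.
function ordersFor(customer) {
  return orders.filter(function (o) { return o.customer === customer; });
}

console.log(ordersFor('Acme')); // the two Acme rows
```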
Management Information Systems (M.I.S.):
MIS combines technology with business to get users the information they need to do their jobs better, smarter, and faster. MIS systems are planned systems for collecting, processing, storing, and disseminating data in the form of information needed to carry out the functions of management. The system(s) consist of people, equipment, and procedures to gather, sort, analyze, evaluate, and distribute needed, timely, and accurate information to decision makers – “The Right Information to the Right People at the Right Time!”
MIS is actually Information Technology Management and is arguably not considered to be computer science. Armed with this information, the contingent of aspiring Computer Science, Business Administration, Secretarial Sciences, Computer Hardware (A+), and Accounting students will be prepared to face the challenges the IT/BI industry and the respective colleges have to dish out.
My friend and other cynics have caused me to wonder after the comments they made. I wondered: what does it take – what course of study – qualifies someone as a computer science student with a major in IT/BI? Well, I’ve studied all of the aforementioned programs and/or courses to an acceptable level of understanding, study, utilization, and practice…not to mention all of the other technological software/programs, articles, periodical reports, and white papers involved in the learning process. Is it due to my background and experience in the Transportation / Hospitality / Customer Service industry for a good many years? Or was the IT/BI study/research in fact secondary? “One Never Knows…Do One?”