#webp is the BEST format for web images and has been standard for years and yet...non-web software is lagging behind on adoption
Explore tagged Tumblr posts
in-case-of-grace ¡ 1 year ago
Text
What the fuck now my website takes 100x longer to load TRANSMUTE IT BACK WHO LET THE TECH ILLITERATE MAGES FUCK UP MY WEBSITE AGAIN?!
the mage's guild has performed a dark ritual to transmute webp to png
13K notes ¡ View notes
wofox906 ¡ 3 years ago
Text
11.11.0
Gitlab 11.11.0
11/11/08 Judge Penny Brown Video Youtube
What Property Is 11-11=0
About Internet Explorer 11
Internet Explorer 11 (11.0.11) is the version of IE released to take advantage of the added capabilities in Windows 8. IE11 on Windows 8 brings an entirely new browsing experience and set of capabilities to the Web, such as a new touch-first browsing experience and full-screen UI for your sites, security improvements that offer the best protection against the most common threats on the Web, improved performance, and support for the HTML5 and CSS3 standards developers need. With this new release, Windows 7 customers receive all of the performance, security, and under-the-hood changes that enable a stellar Web experience. This download is licensed as freeware for the Windows (32-bit and 64-bit) operating system/platform without restrictions. Internet Explorer 11 is available to all software users as a free download for Windows.
Is Internet Explorer 11 safe to download?
Defense Enrollment Eligibility Reporting System (DEERS) Register family members in the Defense Enrollment Eligibility Reporting System (DEERS) for TRICARE and other benefits. Active-duty and retired Service Members are automatically registered in DEERS.
11.11.0.0/22 (AS8003 Global Resource Systems, LLC) Netblock IP Address Information.
Dec 27, 2020 JDK 11.0.11 contains IANA time zone data 2020e, 2020f, 2021a. Volgograd switches to Moscow time on 2020-12-27 at 02:00. South Sudan changes from +03 to +02 on 2021-02-01 at 00:00. For more information, refer to Timezone Data Versions in the JRE Software.
Kaspersky Security Center 11 version 11.0.0.1131 was released on March 14, 2019. Kaspersky Security Center is a single administration console for controlling all Kaspersky Lab security solutions and system administration tools that you use. It makes every endpoint and device on your network more visible and simplifies IT administration tasks. Notes: Computers connected to a network are assigned a unique number known as an Internet Protocol (IP) address. IP (version 4) addresses consist of four numbers in the range 0-255 separated by periods. A computer may have either a permanent (static) IP address or one that is dynamically assigned/leased to it.
We tested the file IE11-Windows6.1-KB2976627-x86.msu with 25 antivirus programs and it turned out 100% clean. It's good practice to test any downloads from the Internet with trustworthy antivirus software.
Does Internet Explorer 11 work on my version of Windows?
Older versions of Windows often have trouble running modern software and thus Internet Explorer 11 may run into errors if you're running something like Windows XP. Conversely, much older software that hasn't been updated in years may run into errors while running on newer operating systems like Windows 10. You can always try running older programs in compatibility mode.
Officially supported operating systems include 32-bit and 64-bit versions of Windows Server 2016, Windows 10, Windows Server 2012, Windows 8 and Windows 7.
What versions of Internet Explorer 11 are available?
The current version of Internet Explorer 11 is 11.0.11 and is the latest version since we last checked. This is the full offline installer setup file for PC. At the moment, only the latest version is available.
Development of this software has ceased, and therefore Internet Explorer 11 is obsolete, so future updates should not be expected.
What type of graphical file formats are supported?
Internet Explorer 11 supports over 4 common image formats including EPS, PLY, SVG and WEBP.
Alternatives to Internet Explorer 11 available for download
Sleipnir Browser
A web browser optimized for beauty and font rendering based on Mozilla's Gecko rendering engine.
Firefox Portable
Portable version of the versatile Firefox browser with support for passwords, history, bookmarks and of course the endless number of extensions and..
NVIDIA Direct3D SDK
A collection of DirectX 11 code samples to create 3D graphics in Windows.
Avant Browser
Fast Internet Explorer-based web browsing adding multi-processor support using a low amount of memory that features a video downloader, anti-freeze and..
Midori Portable
Portable version of a fast web browser featuring several privacy and ad-blocking options.
Pale Moon Portable
Web browser based on the Firefox code that improves page-loading speed by disabling less-used features.
Waterfox Portable
A very fast browser built from 64-bit Mozilla Firefox code.
Intel Driver Update Utility
Midori
A lightweight web browser which focuses on user privacy and blocking web advertisements.
Bing Desktop
Adds Bing functionality and wallpapers to your desktop.
Believer, if your inheritance is meager, you should be satisfied with your earthly portion; for you may rest assured that it is best for you. Unerring wisdom ordained your lot and selected for you the safest and best condition. When a ship of large tonnage is to be brought up a river that has a large sandbank, if someone should ask, 'Why does the captain steer through the deep part of the channel and deviate so much from a straight line?' his answer would be, 'Because I could not get my ship into harbor at all if I did not keep to the deep channel.'
In the same way you would run aground and suffer shipwreck if your divine Captain did not steer you into the depths of affliction where waves of trouble follow each other in quick succession. Some plants die if they have too much sunshine. It may be that you are planted where you get only a little, but you are put there by the loving Farmer because only in that situation will you produce fruit unto perfection.
Remember this: If any other condition had been better for you than the one in which you are, divine love would have put you there. You are placed by God in the most suitable circumstances, and if you could choose your lot, you would soon cry, 'Lord, choose my heritage for me, for by my self-will I am pierced through with many sorrows.' Be content with the things you have, since the Lord has ordered all things for your good. Take up your own daily cross; it is the burden best suited for your shoulder and will prove most effective to make you perfect in every good word and work to the glory of God. Busy self and proud impatience must be put down; it is not for them to choose, but for the Lord of Love!
Trials must and will befall— But with humble faith to see Love inscribed upon them all, This is happiness to me.
One-Year Bible Reading Plan
Send me the Daily Devotional
0 notes
fastcompression ¡ 6 years ago
Text
JPEG2000 vs JPEG vs PNG: What's the Difference?
JPEG2000 vs JPEG vs PNG
If you look for a list of image format standards with good compression ratio, a simple Google search will yield a lot of results. JPEG and the similar sounding JPEG2000, along with PNG, are among the best image compression formats today.
That being said, each of these formats has its particular strengths and weaknesses. To distinguish one from another, we have to look at each one separately. Once we have described each of the three image formats, we will compare them side by side, so you can clearly see how they differ and which is right for you.
There are other well-known raster image formats that were not included in our comparison. GIF is still actively used for animations, but it is limited to 256-color palettes. TIFF is a classical lossless format with support for extended precision (16 bits per channel), but it has weak compression and is not supported by most web browsers. There are also a number of newer formats, like JPEG XR, WebP and HEIF, which are not really popular due to very restricted support in web browsers and image processing software.
What is JPEG?
The acronym JPEG stands for Joint Photographic Experts Group, the name of the committee that created it. The group first convened in 1986, and JPEG is still the most popular imaging format today.
JPEG should not be confused with JPEG2000. The names are alike because both standards were proposed by the same committee, but they are completely different algorithms and formats; JPEG2000 is the more recent and much more sophisticated one.
JPEG is fundamentally a lossy format, which means that encoding always causes some loss of quality. The compression ratio can be significantly increased at the cost of greater losses. This is the main feature that made it so popular for compressing photographic images: photos usually have smooth variations of brightness and color gradients, allowing JPEG to achieve a combination of good compression ratio and decent quality. However, the nature of the JPEG algorithm causes blocking artifacts (especially noticeable near sharp, high-contrast edges), which can be distracting at high compression ratios.
JPEG Features
The JPEG compression algorithm has several important features, which allowed it to gain impressive popularity:
Color space transformation separates the brightness (Y) and chrominance (Cb, Cr) components. Downscaling Cb and Cr reduces file size with almost unnoticeable loss of quality.
Quantization after the Discrete Cosine Transform (DCT) controls the reduction of image size by coarsely rounding the coefficients for sharp (high-frequency) details.
Optional progressive encoding allows a low-quality preview of the whole image to be shown after only part of its byte stream has been decoded.
Lossless entropy coding compacts the DCT-transformed and quantized image data.
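The color space transformation in the first step can be sketched in a few lines of Python. This uses the BT.601 coefficients (the ones used by JFIF-style JPEG files); for pure white, all the information lands in Y while Cb and Cr sit at their neutral midpoint:

```python
# RGB -> YCbCr (BT.601 coefficients): brightness goes into Y, color into
# Cb and Cr, which can then be downsampled with little visible loss.
def rgb_to_ycbcr(r, g, b):
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

# Pure white: Y is approximately 255, Cb and Cr approximately 128 (neutral),
# which is why downsampling the chroma planes barely hurts such regions.
print(rgb_to_ycbcr(255, 255, 255))
```

This is exactly why 4:2:0 subsampling works: the eye is far more sensitive to errors in Y than in Cb/Cr.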
Pros and Cons of JPEG
When looked at as a whole, the features of JPEG make it a dependable format. Here are some of its advantages:
This format has been in use for quite a long time
Almost all devices can support JPEG, which is not the case for JPEG2000
It is compatible with most of the image processing apps
JPEG images can be compressed to as little as 5% of their initial size, which makes JPEG especially suitable for transferring images over the Web
A JPEG codec can be very fast on a CPU and especially on a GPU
Disadvantages of JPEG include:
Quality loss is inevitable after encoding and after each iteration of import/export
Ringing and blocking artifacts distort images with sharp edges, making them harder to recognize
Only 1 or 3 color channels at 8- or 12-bit depth are supported
No transparency preservation for images (no separate alpha channel)
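The quality-versus-size trade-off is easy to see in code. A minimal sketch using the Pillow library (assumed to be installed; the synthetic gradient is just a stand-in for photographic content):

```python
import io

from PIL import Image

# A smooth gradient stands in for a photo.
img = Image.new("RGB", (128, 128))
img.putdata([(x, y, (x + y) % 256) for y in range(128) for x in range(128)])

sizes = {}
for q in (95, 75, 20):
    buf = io.BytesIO()
    # quality drives quantization strength; progressive=True enables the
    # multi-pass preview mode; subsampling=2 requests 4:2:0 chroma downscaling.
    img.save(buf, "JPEG", quality=q, progressive=True, subsampling=2)
    sizes[q] = buf.tell()

print(sizes)  # lower quality -> coarser quantization -> smaller file
```

At very low quality settings the blocking artifacts described above become plainly visible if you reopen and inspect the image.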
What is JPEG2000?
It's easy to assume from the name alone that JPEG2000 (or J2K) is similar in nature to JPEG. In truth, the name is all the two have in common. The J2K algorithm was developed eight years after JPEG took the stage and was seen at the time as JPEG's successor. The main idea behind JPEG2000's development was to create a more flexible and more functional compression algorithm with a better compression ratio.
The JPEG2000 coding system is powered by wavelet-based technology, which allows choosing between mathematically lossless and lossy compression within a single architecture (and even within a single codestream). The Discrete Wavelet Transform (DWT) processes the image as a whole, which prevents the blocking artifacts seen in JPEG.
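The idea behind the wavelet transform can be illustrated with the simplest wavelet, the Haar transform: each pass splits the signal into averages (a half-resolution version) and differences (the detail needed to restore full resolution). Keeping the differences exactly gives lossless coding; quantizing or discarding the small ones gives lossy coding within the same scheme. A toy one-level, one-dimensional sketch:

```python
# One level of the 1D Haar transform: low-pass averages and high-pass
# differences. Real JPEG2000 uses longer (5/3 and 9/7) wavelet filters
# applied in 2D, but the principle is the same.
def haar_forward(x):
    avg  = [(x[2*i] + x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    diff = [(x[2*i] - x[2*i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, diff

def haar_inverse(avg, diff):
    out = []
    for a, d in zip(avg, diff):
        out += [a + d, a - d]
    return out

row = [52, 54, 60, 58, 61, 59, 70, 72]
lo, hi = haar_forward(row)
assert haar_inverse(lo, hi) == row  # exact reconstruction: lossless mode
# Zeroing small detail coefficients instead gives lossy compression, and
# `lo` alone is already a half-resolution preview of the signal.
```

Repeating the transform on `lo` builds the resolution pyramid that makes the scalable decoding described below possible.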
The use of the DWT and a binary arithmetic coder achieves a higher compression ratio than JPEG, especially at low bitrates. Although compression performance was cited as the primary driver of the developers' activity, in the end applications have been attracted by its other advantages.
The codestream obtained after compression is highly scalable thanks to the EBCOT scheme (Embedded Block Coding with Optimal Truncation). J2K allows selecting the order of progression of resolution, quality, color components and position, supplying multiple derivatives of the original image. By ordering the codestream in various ways, applications can achieve significant performance increases or flexibly adapt to varying network bandwidth during transmission of an image sequence. For example, a gigapixel J2K image can be viewed with little delay, because only a display-sized version needs to be read and decoded from the whole file. Another example is the ability to obtain a visually lossless image from the losslessly compressed master image, which can save time and bandwidth.
This format supports very large images (up to 2^32 − 1 pixels on each dimension), multiple components (up to 16384 components for multi-spectral data), and higher dynamic range (1–38 bits per component), where each component can have a different resolution and bit depth.
Actually, JPEG2000 is a whole family of standards, consisting of 12 parts. Its first part “Core Coding System” specifies basic feature set (encoding and decoding processes, codestream syntax, file format) and is free to use without payment and license fees. Amongst additional parts are extensions giving more flexibility (extended file format JPX, Part 2), Motion JPEG 2000 (file format MJ2, Part 3), multi-layer compound images (file format JPM, Part 6), security framework (Part 8), communication protocol JPIP (Part 9), three-dimensional extension (JP3D, Part 10), etc.
Despite all its advantages, JPEG2000 format didn't become as ubiquitous as its developers thought it would be for various reasons. If we compare JPEG2000 and JPEG, J2K is more complex and computationally demanding, so until recently (before sufficient development of processors and parallel algorithms) it was too slow in many practical cases. Another problem was that neither manufacturers nor regular customers were ready to adopt it in early 2000s.
Today JPEG2000 is considered a niche format and is mostly seen when acquiring images from scanners, medical imaging devices, cameras, satellites, digital cinema, and high-end technical imaging equipment. However, JPEG2000 has now reached maturity, has gained support in a good deal of consumer software, and there are solutions to most of the practical problems. So it still has potential for growing acceptance and popularity.
JPEG2000 Features
The most efficient way to understand the difference between JPEG and JPEG2000 is to look at each format's features. Knowing them helps us form a relationship between the two and highlight the differences even more. The following are some of the most important features of JPEG2000:
Single architecture for lossless and lossy compression (even within a single image file)
Highly-scalable codestream – ability to supply versions of image with different resolutions or quality from a single file
Support for very large sizes, multiple components, and very high dynamic range (up to 38 bits per component)
High compression (especially at low bitrates)
Error resilience (robustness to bit errors when communication or storage devices are unreliable)
Fast random access to different resolutions, components, positions and quality layers
Region-of-Interest (ROI) on coding and access
Support for domain-specific metadata in JP2 file format
Very low loss of quality across multiple decoding/encoding cycles
Creation of a compressed image with a specified size or quality
Pros and Cons of JPEG2000
JPEG2000 has some amazing features, and the advantages of using this image format over others are pretty impressive as well. Here are some of the reasons why you might want to use JPEG2000:
A single compression architecture for both lossy and lossless compression
One master image replaces multiple derivatives (different resolutions and quality)
Well suited to video production and working with live TV content
Works well with natural photos as well as synthetic visual content
Resilience to bit-errors.
JPEG2000 also has the following disadvantages:
It is not supported by web browsers (except Safari)
JPEG2000 is not compatible with JPEG. It takes additional time and effort to integrate JPEG2000 into a system or product, even if it already uses the JPEG algorithm
Standard open-source JPEG2000 codecs are too slow for active use
What is PNG?
PNG (Portable Network Graphics) is another format, created for lossless image compression. Today PNG is the most popular image format on websites, and it is also expected to eventually replace the GIF format, which is still actively used for animations. In fact, replacing GIF was the main motivation for creating PNG: the patented GIF required a license and has a well-known limit of 256-color palettes.
PNG uses the non-patented lossless compression algorithm Deflate, which is a combination of LZ77 and Huffman coding. The progressiveness feature of PNG is based on an optional two-dimensional, 7-pass interlacing scheme (Adam7), which, however, reduces the compression ratio when used.
PNG file size depends on the color depth (up to 64 bits per pixel), the predictive filter applied before compression, the implementation of Deflate compression, optional interlacing, and optional metadata. Several options for lossy compression have been developed for this format: posterization (reducing the number of unique colors), advanced palette selection techniques (reducing 32-bit colors to an 8-bit palette), and a lossy averaging filter.
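The effect of the predictive filter is easy to demonstrate with Python's standard-library zlib module, which implements the same Deflate algorithm PNG uses. PNG's "Sub" filter replaces each byte with its difference (mod 256) from the byte to its left, turning a smooth gradient into highly repetitive data that Deflate compresses far better:

```python
import zlib

# A smooth horizontal gradient, typical of synthetic images.
row = bytes(range(256))

# PNG "Sub" filter: each byte becomes its difference (mod 256) from the
# byte to its left; the first byte is kept as-is.
filtered = bytes([row[0]] + [(row[i] - row[i - 1]) % 256
                             for i in range(1, len(row))])

raw_size = len(zlib.compress(row, 9))
filtered_size = len(zlib.compress(filtered, 9))
print(raw_size, filtered_size)  # the filtered row compresses far better
```

The gradient becomes one leading zero followed by 255 identical bytes, which is why choosing a good filter per scanline matters so much for PNG size.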
Although GIF supports animation, it was decided that PNG should be a single-image format. However, in 2008 an extension of PNG called APNG (animated PNG) was proposed, and it is now supported by all major web browsers except Microsoft IE/Edge. Even Edge will gain support soon: in December 2018 Microsoft announced it would use Chrome's Blink engine in the Edge browser, discontinuing development of its own proprietary EdgeHTML engine.
PNG supports color-correction data (gamma, white balance, color profiles). Correction is needed because the same numeric color values can produce different colors on different computer setups, even with identical monitors. In practice, though, using this feature can be a problem, and this information is often removed by PNG optimization tools.
PNG Features
PNG has several main features that allowed it to become the most popular lossless format for raster synthetic images. Let’s briefly look at each one:
Lossless compression
Support for alpha-channel transparency (unique among the most popular web image formats)
7-pass progressiveness
The PNG compression algorithm can process true-color, grayscale, and palette-based images from 1-bit to 16-bit depth (unlike JPEG, which supports only the first two, and only at 8 or 12 bits)
Several choices of trade-off between compression ratio and speed
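These features sit in a deliberately simple container format. As a sketch, a complete, valid PNG file, alpha channel included, can be written with nothing but the Python standard library (the output file name is just an example):

```python
import struct
import zlib

def chunk(tag, data):
    # Each PNG chunk: 4-byte big-endian length, tag, data, CRC over tag+data.
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data)))

# A 2x2 RGBA image: red, green, blue, and one fully transparent pixel.
pixels = [
    [(255, 0, 0, 255), (0, 255, 0, 255)],
    [(0, 0, 255, 255), (0, 0, 0, 0)],
]
# Each scanline is prefixed with filter type 0 (no predictive filtering).
raw = b"".join(b"\x00" + bytes(v for px in row for v in px) for row in pixels)

# IHDR: width, height, bit depth 8, color type 6 (RGBA), no interlacing.
ihdr = struct.pack(">IIBBBBB", 2, 2, 8, 6, 0, 0, 0)
png = (b"\x89PNG\r\n\x1a\n"
       + chunk(b"IHDR", ihdr)
       + chunk(b"IDAT", zlib.compress(raw))  # pixel data, Deflate-compressed
       + chunk(b"IEND", b""))

with open("tiny.png", "wb") as f:
    f.write(png)
```

The whole format is just the 8-byte signature plus CRC-protected chunks, with the pixel data Deflate-compressed inside IDAT, which is a large part of why PNG support is so universal.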
Pros and Cons of PNG
PNG compression is practical, and that makes it a popular tool for storing and transmitting synthetic and computer-generated graphical images. Here are some additional advantages of this format:
Wide support by web browsers and other software
No patent issues
Alpha channel for adjustable transparency of pixels (opacity)
High dynamic range (up to 16 bits per channel)
PNG is not perfect and has its own drawbacks too:
No inherent support of lossy compression
Low compression ratio due to outdated compression algorithm
No inherent support of animation (only in extensions such as APNG)
Which is better: JPEG vs JPEG2000 vs PNG
JPEG2000
Advantages
Both lossy and lossless compression
Flexible progressive decoding
Very good image compression ratio
Error resilience
Disadvantages
Not universally supported by browsers
Very high computational complexity
JPEG
Advantages
Compatible with all web browsers
Supported by almost all image processing software and devices
Very fast either on CPU or GPU
Disadvantages
No lossless mode in the original standard
Blocking artifacts
No transparency preservation
PNG
Advantages
Compatible with all web browsers
Reliable lossless compression
Full transparency control
Disadvantages
Not suitable for strong lossy compression
Low compression ratio
Conclusion
Each of these three image formats can be useful for different tasks. JPEG is compatible with most devices and hardware, so it can be used almost everywhere today though with some quality limitations. JPEG2000, on the other hand, is more useful for maintaining high quality of images and dealing with real-time TV content, while PNG is more convenient for online transfer of synthetic images. Each of them has unique properties that can be applied for storing and processing images in different situations.
Original article see here: https://www.fastcompression.com/blog/jpeg-j2k-png-review.htm
0 notes
isopodhours ¡ 2 years ago
Text
No for real, you should be fucking furious about this and here's why:
The current JPEG format fucking sucks. It's from 1992 (happy 30th birthday to the worst image format ever!) and has bad lossy compression and doesn't support transparency or HDR and is generally just horrible. You probably know this. So did its creators, the Joint Photographic Experts Group.
Enter JPEG 2000, a format released eight years after the original that was superior in about every way imaginable. It sported a vastly superior lossy compression algorithm, support for transparency, HDR, and LOSSLESS compression (optionally), among other benefits. For reasons I can't possibly fucking imagine, it died. It simply didn't get wide enough adoption and fizzled out. Not a day goes by where I don't mourn what could have been.
Anyway, recently they tried again. That's what JPEG XL is, except even better this time. JPEG XL is like JPEG 2000, but with an even better compression algorithm, animation support, perfect backwards compatibility with original 1992 JPEG files, thousands of spare channels for extra image data (think thermal information, depth mapping, etc.), layering support (technically), I could fucking go on. This shit would have blown every existing image format from original JPEG to PNG to GIF out of the fucking water in every possible way and it had UNIVERSAL industry support. We're talking Adobe, Facebook (I know), Intel, almost anyone you care to name with even a tangential stake in the digital media landscape... except for Google.
Google, for whatever godforsaken reason, isn't on board with this. My guess is they want to push their (inferior, mind you) format, WEBP. JPEG XL support was already implemented in Chromium, you just had to set a hidden flag to enable it. They could have just turned it on by default for everyone anytime they wanted, but as stated in the post I linked above, they've decided instead to remove it completely.
Google Chrome has a nearly 80% market share with desktop web browsers. Add in the handful of remotely popular forks of it and that number gets closer to 90% for Chromium-based browsers in general. This means Google has a functional monopoly over what most web standards look like, because anything Chrome doesn't support simply won't work for the overwhelming majority of internet users. Google is in a position here to be the sole arbiter of whether JPEG XL lives or dies, regardless of what everyone else wants, and it's chosen to kill it. Unless by some miracle Google decides to reverse its decision, the (in my opinion) single best image file format ever invented is dead in the water because the monopolistic megacorp that controls 80% of humanity's primary means of viewing those images (presumably) assumed it could make a buck by killing it.
This shit is why I've been saying not to use Chrome. This shit is why we can't afford to let the alternatives die. Because if this one company has the power to hold back this kind of technological progress on a whim, who knows what kind of damage they'll do next if left unchecked. Today they killed an image format nobody's heard of. Tomorrow, maybe email becomes synonymous with gmail. Maybe some proprietary functionality gets added to Chrome that no other browser can support and websites start using it, making them unusable in anything but Google™ Chrome™. Maybe the boundary between website and app gets eroded even further until as far as most people are concerned, they're literally the same thing, with all the evils of the latter and none of the benefits of the former. Allowing any corporate entity the kind of power Google has is dangerous in a way I cannot even begin to articulate.
I hope I'm not coming off too fearmonger-y here, but for the sake of everything remotely good about the internet, fucking use Firefox. It doesn't even have to be Firefox. Anything not based on Chrome works. Safari, fucking Internet Explorer, I don't care. If you can at all afford to do so, get Chromium the fuck out of your life now. And not just because of what happened to JPEG XL. Google and its ilk have a vested interest in seeing every bit of openness, standardization, and decentralization that makes the internet great destroyed. If Google had its way, web browsers as we know them would at best be some obscure hacker tool that you're either a huge nerd or suspicious for even knowing how to use, if they existed at all.
TL;DR: JPEG XL is the best image format ever invented by a wide margin, Google killed it by axing support for it in Chrome, and it's fucking existentially terrifying that they have the power to do that. Use Firefox.
Chrome is removing jpeg XL support btw
38 notes ¡ View notes
unixcommerce ¡ 7 years ago
Text
YOOtheme Pro Review: Strong Theme and Page Builder for WordPress and Joomla
For now, forget about all the fancy sales language you know. Or your products' amazing and exceptional features. There's only one thing that will make your traffic stay the moment they land on your site.
Yes, you’re dead right. It all comes down to your web design.
And the numbers are astonishing. Basically, 94% of your traffic would not trust a site with poor design. Chances are, they’ll just leave to engage other businesses with better-designed sites.
That’s why I’ve always been extremely keen about the themes and templates I adopt for my sites, especially if they come from third-party providers.
Speaking of which, WordPress users have always been lucky when it comes to this. We have a wide range of options to choose from, including dedicated theme providers, and page-builders that also come with their own set of professionally-designed themes.
And you know what? The providers are not having it easy. Each additional tool means increased competition. So, of course, they have to keep reinventing themselves to provide better, well-optimized themes, layouts, and templates to survive.
Now, I’ve followed a couple of promising services through this journey, and I have to admit that YOOtheme has outstandingly improved its themes quite substantially over the years.
While I knew that the solution was in for the long haul, I never, not even once, predicted that they would ultimately go for the long ball.
Let’s face it. YOOtheme Pro caught many of us by surprise. Because although they are closely tied, themes and page building are two utterly different ballgames.
But guess what? That’s exactly what makes this move exceptionally intriguing. According to their team, YOOtheme Pro is the most powerful theme and page builder for WordPress and Joomla. They are not playing around here. They mean business.
So, let’s see how much of that they can actually deliver. This YOOtheme review focuses on YOOtheme Pro features, functionalities and the corresponding pricing, plus overall efficacy.
How good is it, really?
But first, let’s see what YOOtheme is all about.
YOOtheme Reviews: Overview
YOOtheme Pro might be new. But the company has been around for quite some time. For longer than a decade, to be precise.
Joomla and WordPress themes plus plugins might have been their principal focus all along. But you’d be mistaken to assume that that’s pretty much all they’ve been doing.
That said, have you ever heard of UIkit? It's basically an open source front-end framework for web interface development.
If not, what about Pagekit? That’s another open source solution, which is essentially a modern intuitive CMS.
Well, YOOtheme is also the brains behind these two projects, both created at their Hamburg, Germany headquarters. Quite a number of remarkable solutions to their name, to say the least.
In retrospect, the first move was made back in 2007 by Steffan and Sascha, in one of their basements. YOOtheme has since grown exponentially to host more than 150,000 customers. UIkit has also managed to attract an admirably extensive fan base, going by the half a million sites it has helped build.
The experts and creatives at YOOtheme continue striving to develop what they call “most cutting-edge web software”. Hence the introduction of YOOtheme Pro to further empower users on WordPress and Joomla.
So, how about assessing just how powerful it actually is?
Well, let’s dive in.
YOOtheme Pro Reviews: Features
Page Builder
Let’s start off at the top. With what is considered the core offering on YOOtheme Pro.
To keep everything simple and intuitive, the page builder is built around the well-known drag-and-drop functionality. This makes the whole design interface clean and pleasantly ideal for both developers and inexperienced builders.
The subsequent editing process is equally straightforward. You can systematically structure your pages into grids, rows, and columns to create an attractive layout that visitors can easily follow through.
I’m particularly fond of the masonry effect option, which allows users to establish neat layouts with multiple columns. The resultant gap-free system even looks great on pages with varying grid cell lengths.
If you find this a bit too monotonous, you can throw in the parallax effect to come up with an extensively dynamic page outlook.
When you’re done with the general structure, you can shift to edit the finer details that ultimately determine a page’s functions and features. For this, thankfully, YOOtheme Pro provides not just the basic element options like “Image” and “Heading”, but also advanced ones like “Slideshow”.
The slideshow element, for instance, offers five different systems of animations, optimized for both PC and mobile. Plus, you can embed both videos and images to achieve a distinctively refreshing and modern website.
And guess what? Using all these tools doesn’t require any coding experience. You can have a website in just minutes without hiring a developer.
But, don’t get me wrong. While it’s possible to create a page in a couple of minutes, you should be extremely careful with the whole process. A perfect, well-optimized page is best created when you put your mind to it, thinking through all the possible options.
In the layout library, for example, it's advisable to review all the premium layouts to select the most suitable one. Of course, this is easier said than done, since YOOtheme has engaged a team of professionals to churn out attractive and trendy templates. You'll likely be spoilt for choice here, since most of them look like they could fit perfectly on any website.
To help you sort through the entire heap, YOOtheme Pro allows you to filter by topic. Consequently, you’ll be able to conveniently select a layout that suits your preferences and business needs. And changing the general outlook of your WordPress site is as simple as a single click.
Now, even with a wide array of professionally designed layouts and templates, it's impossible to address all possible user preferences. So, to work around this, YOOtheme Pro also supports extensive customization. You can adjust pretty much anything: colors, typography, element sizes, fonts, spacing, and position, plus global settings for PC and mobile.
Local Google Fonts
And speaking of font customization, YOOtheme handles Google fonts locally. If you choose to proceed with any of the Google font styles, the corresponding files are automatically downloaded to your site’s server and embedded into the CSS.
But, why is this even a big deal?
Normally, support for an extra font would be a welcome addition. But, admit it: on its own, that wouldn’t be exciting enough for a bottle of champagne.
But Google Fonts are different in every sense of the word. Storing them locally, for starters, drastically improves your page loading speed: the browser doesn’t have to make a round trip to Google’s servers because everything is available locally.
Secondly, it sorts out the whole issue of GDPR-compliance. Your traffic’s privacy is adequately assured as a result.
Integrated Unsplash Library
It might seem like a small and negligible problem at first. But, when you come to think of it, finding ideal stock images has got to be one of the biggest challenges for website owners.
We’d all probably be walking around with digital cameras, looking for picturesque shots for our sites. Or maybe steal a couple of copyrighted ones, which would, of course, invite Google’s penalty whips. But then sites like Unsplash came to the rescue with an extensive array of stock images.
Now, sourcing images from Unsplash, for most site owners, means downloading them to your PC first and then uploading them to the site. Not much of a problem for small sites, but it becomes quite a hassle when you’re dealing with multiple pages.
Thankfully, YOOtheme Pro has made things much easier by bringing the library to you. Unsplash is now seamlessly integrated into the service’s media manager, allowing you to search and lift images directly. You can also filter the images and scan through the various collections.
Finally, instead of downloading the images to your PC, they are simply added to your site’s media folder when you save your layout. It really is that simple.
Developer Support
YOOtheme is simple, with a solid list of elements, and is universally customizable. Plus, of course, the drag-and-drop functionality is intuitive, and should go well with the range of layouts available.
Now, that pretty much covers everything standard users need to comfortably build a site without any coding experience: user-friendliness and flexibility.
But that’s not all that YOOtheme provides. One interesting fact about it is that it doesn’t lock out experienced coders.
Well, you could use the standard page-building functionality like regular users. Or, alternatively, capitalize on YOOtheme Pro’s expandable and modular framework to code your way to a well-customized site.
This provision essentially allows you to override all elements, and introduce your own custom themes and layouts.
And since it’s not always a smooth process, YOOtheme Pro provides comprehensive documentation with precise details on all the customization options. It should come in handy until you finally learn the ropes.
Overall Features
YOOtheme Pro also provides:
One-click updates
WooCommerce integration for ecommerce
Numerous blog options
Customized footers
Three mobile header layouts
16 header layout options
User-friendly color picker
More than 125 icons
Global user interface components
WebP image format
Automatically generated srcsets
Lazy-loading images
Max width breakpoint
Mobile optimization
Extensive style customizer
Modernized layouts
Thumbnail navigation
YOOtheme Pro Reviews: Pricing
Sadly, there is no free option here. You have to pay to use YOOtheme Pro.
And that brings us to another downside. For a service that doesn’t come with a free package, we expect at least a limited free-trial period. But YOOtheme is having none of that either.
The only thing you get is a 30-day money-back guarantee. If you don’t like its offerings, you can request a full refund.
That said, there are three standard packages:
Basic – €49
Risk-free guarantee
Technical support
Regular updates
Access to all themes
Subscription for 3 months
Updates for 1 site
Standard – €99
All Basic features
Subscription for 12 months
Updates for 3 sites
Developer – €299
All Standard features
Unsplash integration
For WordPress and Joomla
Subscription for 12 months
Updates for unlimited sites
Who Should Consider Using YOOtheme Pro
To recap, let’s first review the key takeaways:
Basically, 94% of your visitors won’t trust a site with poor design.
YOOtheme has improved its themes substantially over the years.
According to their team, YOOtheme Pro is the most powerful theme and page builder for WordPress and Joomla.
To keep everything simple and intuitive, the page builder is built around the well-known drag-and-drop functionality. This makes the whole design interface clean and pleasantly ideal for both developers and inexperienced builders.
YOOtheme Pro provides not just the basic element options like “Image” and “Heading”, but also advanced ones like “Slideshow”.
If you choose to proceed with any of the Google font styles, the corresponding files are automatically downloaded to your site’s server and embedded into the CSS.
Unsplash is now seamlessly integrated into YOOtheme Pro media manager, allowing you to search and lift images directly.
You could also capitalize on YOOtheme Pro’s expandable and modular framework to code your way to a well-customized site. This provision essentially allows you to override all elements and introduce your own custom themes and layouts.
YOOtheme Pro comes with a 30-day money-back guarantee. If you don’t like its offerings, you can request a full refund.
Evidently, YOOtheme Pro attempts to cater to both standard users and developers. It combines simplicity for non-coders with modular flexibility for experienced coders. Quite a tricky balance there, but so far so good for this solution.
At the moment, YOOtheme Pro is ideal for standard websites and blogs. Ecommerce sites, on the other hand, are better off with services that come with specialized tools for building and managing stores.
All in all, I have to admit that YOOtheme is doing very well for a relatively new page builder, although it still has a couple of features to catch up on. But going by the frequency of new feature rollouts, I predict that YOOtheme could possibly be the next big thing in the WordPress page building space. For now, let’s wait and see.
The post YOOtheme Pro Review: Strong Theme and Page Builder for WordPress and Joomla appeared first on Inspired Magazine.
photomaniacs ¡ 8 years ago
Photo
Photos and Color Profiles: The Quickly Approaching Move to Wide-Gamut
My name is Kelly Thompson, and I’m a VP at 500px. Buried in Tuesday’s announcement of Google’s Android Oreo was an interesting tidbit for photographers: like Apple the year before, Google’s mobile OS has been reworked to support deep and wide color, and, for the first time, full color management for Android devices.
What does this mean for photographers and their workflows? It’s probably a good time to review your processes to make sure you’re getting the best possible results for the widest audience.
At 500px, we’ve made some significant changes to better support the amazing new screens, while also trying to cut down on file sizes.
20+ Years of sRGB
Do you remember what monitors looked like 20 years ago? It wasn’t pretty. In 1996, Microsoft and HP developed the sRGB standard. It has served its purpose well, but it’s definitely showing its age.
When it comes to sRGB on 500px, we’ve always done two things to be the most widely compatible and the most space efficient:
1. Converted non-sRGB images to sRGB
2. Stripped the sRGB profile from the image
The first step allowed us to be compatible with the broadest set of users. Until recently, most screens were sRGB calibrated or weren’t calibrated at all (which usually meant they were close enough to sRGB). This meant anyone with wide-gamut displays wouldn’t get to see the images uploaded in wide-gamut profiles (Adobe RGB, ProPhoto RGB, P3). Unfortunate, but it meant most people saw a fairly accurate representation of the intended look.
The second step often confuses people, but if you think about it from the point of view of a site delivering billions of image views, it starts to make a lot of sense. The standard sRGB color profile is over 3KB in size. Multiply that out and it’s a lot of wasted data when browsers assume that an untagged image is sRGB (at least they should! More on this later). Stripping the color profiles from the images saves 25-30% in data transferred. On a small 5KB thumbnail, it effectively cuts the file size in half.
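To see why a roughly 3 KB profile matters at scale, here is some toy arithmetic (illustrative sizes only, not 500px's actual numbers):

```python
# Toy illustration: share of the transfer taken up by a ~3 KB ICC profile.
PROFILE_BYTES = 3 * 1024  # the standard sRGB profile is a bit over 3 KB

for image_kb in (5, 50, 500):
    total_bytes = image_kb * 1024 + PROFILE_BYTES
    overhead = PROFILE_BYTES / total_bytes
    print(f"{image_kb:>3} KB image: profile is {overhead:.0%} of the transfer")
```

The smaller the image, the larger the proportional hit, which is exactly why stripping profiles pays off most on thumbnails.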
As mentioned above, the W3C consortium states untagged images should be treated as sRGB. Google Chrome does NOT do this if you have a wide-gamut display. Instead, it renders untagged images in the profile of the display rather than sRGB. Photographers with calibrated displays (many, many of them!) end up seeing noticeably wrong-looking images.
The color bars below demonstrate the Chrome issue. The top bar is an sRGB file. The bottom is the same file, untagged. They should look exactly the same. They won’t if you’re on Chrome with a wide-gamut display.
Other companies had faced similar issues. Facebook took an interesting approach – they optimized the sRGB profile down to just 524 bytes and renamed it ‘TINYsRGB’. It produces results that are indistinguishable from the original profile, so we rolled the TINYsRGB profile out broadly to the community. It kept files sizes down and fixed the Chrome issue.
The Windy Road to Wide-Gamut
With that issue solved, we moved on to delivering full wide-gamut across the board. Photographers have long been shooting in wide-gamut (notably Adobe RGB) and RAW formats, but most were converting to sRGB before uploading. When iPhones with Display-P3 cameras and screens arrived, we saw a large uptick in the number of images being delivered in wide-gamut format. The phones plus the new P3 displays on the iMac & MacBook Pros meant that many pros had a wide-gamut workflow available for the first time. Other companies have followed suit and delivered great wide-gamut displays of their own (i.e. the Microsoft Studio and the beautiful new Dell displays for example). It was time for the switch.
Most photographers think that when they upload a file on 500px, the same image gets delivered back out when requested. That isn’t the case. Our image resizer generates the required file based on screen resolution, required image size, quality level, watermark requirements, etc.
In order to support wide color, we updated our rendering pipeline to stop converting images to sRGB and deliver the wide-gamut versions when available. People with wide-gamut displays finally started seeing those images at their best.
An image of pencil crayons in the Adobe RGB color profile by JosĂŠ de JesĂşs Cervantes Gallegos shows the full color range.
This image displays the colors that sRGB CAN represent as grayscale, while keeping the Adobe RGB-only spectrum as color. The vibrant blues, rich teals and brilliant oranges are not visible in sRGB.
Except if you were using an Android phone. Android versions before Oreo didn’t support color management at all. This was a spectacular shame since the screens are exceptional – wide-gamut, highly accurate, with massive dynamic range thanks to their OLED technology. What did this mean? The phones served up clownishly oversaturated colors in photos with color profiles other than sRGB.
Owners of Android phones that can upgrade to Oreo are going to be in for a pleasant surprise. These are going to be some of the best-looking screens to view images on – ever. It does, however, create a bit of a conundrum – there are almost no OLED laptop screens yet. The screen in your pocket will handily beat your desktop or laptop one. The displays we process our images on really need to catch up.
Here we see a vibrant Display P3 profiled image. Image by Kaanchana Kerr.
In this image, we see the same image with colors outside of the sRGB spectrum kept as color and those viewable in sRGB dropped to grayscale.
Seeing Wide-Gamut Images on 500px
If you’ve got a wide-gamut display and would like to see some images in action, we have just the thing. We’ve added a filter to 500px’s web search that allows you to narrow your results to three different wide-gamut options. It’s a great way to see what your fellow photographers are up to.
WebP vs JPEG Compression
Years ago after the WebP format came out, we investigated using it on the site. It didn’t make much sense for us then due to the way our backend system worked with caching and the lack of support for the format. It’s also heavily optimized for highly compressed images – something we don’t really serve at 500px (on the JPEG quality level, we’re in the 80 to 85% range). The changes we’ve made to our rendering system over the years (and recent changes to prep for more modern file formats), as well as the extreme popularity of Chrome as a browser, made us take another look at the format.
Here’s a JPEG photo by Artur Borzęcki.
This WebP image looks better (especially around edge detail and subtle background gradients) and is 14% smaller.
It took us many rounds of testing to get WebP images we were happy with. The algorithm excels at edge detail where JPEG utterly falls down but tended to smear out details in subtly textured areas of an image. Because we compress thumbnails at a higher rate than the full images, we had better luck with them. There are also lots of edge cases. Sometimes images looked significantly better (especially grainy images) but were actually larger than their JPEG equivalent.
Overall, we’re very happy with where we’ve landed – about a 25% overall reduction in file size for browsers that support the format (Chrome, basically). We’ve also rolled it out to our latest Android app.
What it Means for You, the Photographer
If you’ve been shooting RAW all these years and have kept your original files, you’re in luck. They contain loads of extra data that outputting them as sRGB JPEGs would have lost. But you can go back and reprocess the RAW files, exporting to any wide-gamut space you like.
Just keep in mind it’s heavily recommended you have a wide-gamut monitor to see what you’re actually doing to the file. This is going to be even more exciting when we get distributable formats that support more than 8-bits per channel. It’s especially important for wide-gamut images, as the wider gamut means you’re stretching those 8 bits across a wider area, resulting in the dreaded banding. Remember to process these types of images in 16-bits-per-channel color depth (or higher) in Photoshop or your favorite image software to keep them at the highest possible quality before exporting.
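The banding point is easiest to see as arithmetic on the number of levels per channel (a simplified view that ignores gamma and perceptual non-uniformity):

```python
# Distinct levels per channel at each bit depth
levels_8 = 2 ** 8    # 256
levels_16 = 2 ** 16  # 65,536

# A smooth gradient rendered across a 2048-pixel-wide image:
width = 2048
print(f"8-bit:  one level roughly every {width / levels_8:.0f} pixels -> visible bands")
print(f"16-bit: {levels_16 // levels_8}x more levels to spread over the same span")
```

Stretching those same 256 levels over a wider gamut makes each step cover even more color distance, which is why the 16-bit headroom matters more for wide-gamut work.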
If you’re uploading to 500px, we’re keen to accept your wide-gamut files. If you’ve been uploading images from our app on an iPhone 7 or 7 Plus, you’ve already been submitting Display P3 profiled images. For the best-looking files, make sure you’re delivering us images bigger than 2048 pixels per side. We create up to 20 different file sizes from your original, so a large image reduced in size can hide a multitude of sins affecting the original. The more we have to work with the better!
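The "up to 20 different file sizes" pipeline can be sketched as a simple size ladder. The 25% step and 256px floor below are hypothetical choices for illustration, not 500px's actual values:

```python
def size_ladder(original_width, min_width=256):
    """Generate downscaled width variants by repeated 25% reductions."""
    widths = []
    w = original_width
    while w >= min_width:
        widths.append(w)
        w = int(w * 0.75)
    return widths

# A 4096px-wide upload yields a ladder of ten sizes down to ~300px
print(size_ladder(4096))
```

Starting from a large original means every rung of the ladder is a clean downscale; starting from a small upload forces the resizer to work with far less detail.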
On the Marketplace side of 500px, we have always sold the files in the color space they were delivered in, so that won’t change. Previewing them on the site will give you the full gamut experience as well.
It’s been an exciting few years for color, technology, cameras, and photography. As things continue to progress, 500px will continue to try and deliver the best-looking images possible with the latest tech at our disposal. We look forward to seeing what you produce!
About the author: Kelly Thompson is the VP of Strategy, R&D, Community Management at 500px.
The post Photos and Color Profiles: The Quickly Approaching Move to Wide-Gamut appeared first on CameraFreaks.
August 25, 2017 at 07:04PM
hotspreadpage ¡ 8 years ago
Text
Site speed tactics in a mobile-first world: Why ecommerce brands need to step up their site speed game
A study of 700 top ecommerce brands found that the majority are underperforming when it comes to optimizing their sites for speed. Find out how you can avoid the same mistakes.
Web users are not patient. The speed of your site can make a massive difference to whether people will visit your site, whether they’ll stay on it, and whether they will come back. Not to mention whether they’ll make a purchase.
A massive 79% of shoppers who have been disappointed by a site’s performance say that they’re less likely to use the site again. But what constitutes ‘disappointed’?
We’re only human after all
Kissmetrics research on customer reactions to site speed has resounded across the industry, but it’s not something that should be forgotten:
“If an e-commerce site is making $100,000 per day, a 1 second page delay could potentially cost you $2.5 million in lost sales every year.”
That’s a 7% reduction in your conversion rate, and 52% of customers say site speed is a big factor in their site loyalty. A one second delay is bad – a two second delay is worse. 47% of consumers expect web pages to load within two seconds.
But based on the same research, a faster full-site load leads to a 34% lower bounce rate, and an improvement by just one second results in a 27% conversion rate increase.
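Those figures are easy to sanity-check. A back-of-the-envelope sketch using the numbers quoted above (illustrative, not real store data):

```python
# Back-of-the-envelope figures from the Kissmetrics example above
DAILY_REVENUE = 100_000   # $100,000/day store
CONVERSION_HIT = 0.07     # ~7% conversion drop per one-second delay

daily_loss = DAILY_REVENUE * CONVERSION_HIT
yearly_loss = daily_loss * 365
print(f"Lost per day:  ${daily_loss:,.0f}")
print(f"Lost per year: ${yearly_loss:,.0f}")  # ~$2.5M, matching the quote
```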
It’s because site speed is such a vital part of building a successful ecommerce site that my team at Kaizen and I conducted a study into 700 top UK ecommerce sites, analyzing various aspects of their site speed performance.
What we found is that the biggest brands have some of the poorest optimization, with outdated web protocols, unresponsive pages, and bloated page size.
The average web page size is now 2.3MB (that’s the size of the shareware version of the classic game Doom), so we wanted to see whether the ecommerce industry is any better – since their businesses are directly dependent on their website performance.
Surprisingly, we have found that the web page size of the top UK ecommerce sites is 30% larger on average than standard websites – at 2.98 MB.
Average webpage size according to HTTPArchive
However, the web page size isn’t the only factor impacting the site speed. Even larger sites load and render quickly if they’re smart about how they deliver.
My team and I picked the top 700 UK ecommerce sites, based on their estimated monthly traffic with data kindly supplied by SimilarWeb. For each, we analysed them using Google’s PageSpeed Insights API, checked their page size and loading time on Pingdom, and verified their HTTP protocol using https://http2.pro/.
From this, we found the following data, and used it to determine which websites are best optimised for speed:
PageSpeed Insights Desktop Score (not considering third party data)
PageSpeed Insights Mobile Score (not considering third party data)
HTTP/2
Web page size
Loading Time
Loading Time per MB
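On the HTTP/2 check: the study used http2.pro, but you can probe a server yourself. A minimal sketch using only the Python standard library (it assumes the site serves TLS on port 443, and asks the server which protocol it negotiates via ALPN):

```python
import socket
import ssl

def describe_alpn(proto):
    """Map a negotiated ALPN protocol ID to a human-readable label."""
    return {"h2": "HTTP/2", "http/1.1": "HTTP/1.1"}.get(proto or "", "unknown")

def negotiated_protocol(host, port=443, timeout=5.0):
    """Connect over TLS and return the ALPN protocol the server picks ('h2' means HTTP/2)."""
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["h2", "http/1.1"])  # offer both; the server chooses
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()

# Example (requires network), e.g. for one of the HTTP/2 sites named below:
#   print(describe_alpn(negotiated_protocol("www.vans.co.uk")))
```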
Desktop vs mobile
Mobile connections are usually slower than desktop to begin with, so further delays are felt even more keenly. This, together with the fact that Google’s latest mobile update now factors site speed into mobile page ranking, makes it a high value consideration to ensure mobile pages are sufficiently optimized.
This becomes even more of a consideration when you factor in how much of ecommerce traffic is now mobile – for example Vodafone, the third top-scoring website in our recent research, receives only 20% of their traffic from desktop, with 80% coming from mobile devices.
Make your site work for you
Your site speed isn’t simply a dial you can turn up in your page settings; there are a number of factors which contribute to it. Here’s what they are, and how you can start making your site one of the fastest out there.
Protocol power
HTTP/1.1 isn’t SPDY enough
Network protocols are the rules and standards that govern the end points in a telecommunication connection – how data is transmitted over the web. Common examples include IP – Internet Protocol – and HTTP – Hypertext Transfer Protocol.
The HTTP/1.1 protocol is decades old and doesn’t make full use of newer technologies. Its main downside is it doesn’t allow you to download files in parallel. For each file (or request), the server needs a separate connection.
HTTP/1.1 enables only one request per connection, while browsers now support a maximum of 6 connections per domain. This means that the number of files which can be downloaded and rendered simultaneously is limited – and that costs time.
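A crude model makes the cost of that cap concrete. This sketch only counts request round-trips, ignoring bandwidth, connection reuse, and TCP slow start (the assumed 100 ms RTT is arbitrary):

```python
import math

def download_rounds(num_files, connections=6):
    """HTTP/1.1 model: one request per connection per round trip."""
    return math.ceil(num_files / connections)

RTT_MS = 100  # assumed round-trip time
for files in (12, 30, 60):
    h1_ms = download_rounds(files) * RTT_MS
    print(f"{files} assets: ~{h1_ms} ms of round-trips on HTTP/1.1, "
          f"~{RTT_MS} ms multiplexed on HTTP/2")
```

Even in this simplified view, a 60-asset page spends ten round-trips just queuing requests over HTTP/1.1, while a multiplexed connection can issue them all at once.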
Since the time of HTTP/1.1, Google has developed a newer version of the protocol, SPDY (“Speedy”), which allows simultaneous connections to be opened, and means it can serve multiple parts of the website (JavaScript, HTML, images, etc.) in parallel.
But SPDY isn’t the latest protocol developed by Google. Working closely with W3C (World Wide Web Consortium), they’ve developed the new HTTP/2 protocol. HTTP/2 has roughly the same characteristics as SPDY, but is also binary, and allows the server to ‘push’ information to the requester, with better HPACK compression.
Despite the clear advantages of the HTTP/2 protocol, only a few websites have made use of it. Our recent research discovered that only 7.87% of the top 700 ecommerce sites use the technology – compared to 11.3% of sites overall. Some examples of websites using HTTP/2 are https://www.vans.co.uk/, http://ift.tt/2nFuJmE or http://ift.tt/1yeQXOW.
According to Cloudflare.com, when they implemented HTTP/2, they saw customers’ average page load time nearly halved – from 9.07 seconds for HTTP/1.X falling to 4.27 seconds for HTTP/2. That’s a significant improvement in a key area of website efficiency.
However, HTTP/2 doesn’t solve everything, and in some cases the results can be disappointing. In our research, many websites achieved only very small speed gains in their loading times when served over HTTP/2 instead of HTTP/1.1.
Switching to HTTP/2 isn’t enough by itself – many websites fail to optimize for the change and lose out on the maximum speed gains.
Old-school techniques, such as domain sharding or sprites, can be counter-productive. And using huge CSS or JavaScript files where less than 10% of the rules and code is relevant to pages likely to be visited is a waste of both your user’s time and your server’s time.
Screenshot from Dareboost comparison analysis of Oliver Bonas’ loading performance
Even our own measurements showed that the average loading time per 1 MB for websites supporting HTTP/2 was 1.74s, compared to 1.44s for websites not supporting HTTP/2.
A nice example of a successful HTTP/2 optimisation is Paperchase, who saved a full second of time necessary to load their website, as is shown here:
Screenshot from Dareboost comparison analysis of Paperchase loading performance
How To Tackle Protocols – HTTP/2 and you
If you want to be at the forefront of network protocols – and at the top of the list of faster sites – get an HTTP/2 protocol in place.
While HTTP/2 only requires the server to support the new protocol (many now do, though Microsoft’s IIS has no plans yet), the browsers need a TLS connection. This means every connection over HTTP/2 will be safe and secure, adding an extra layer of security to the internet.
For more information on how you can get started with HTTP/2, have a look at the Kaizen in-depth article here.
It’s all about size
The smaller, the better
If you’re trying to get speed up, you need to keep size down. The less there is to move from the Internet to the user, the less time it takes.
As I mentioned earlier in this article, the ecommerce sites looked at in our study were bigger on average than other webpages out there – 30% bigger, at 2.98 MB, compared to a global standard of 2.3MB.
Format, compress, minify
One of the biggest issues on plus-sized websites is pictures. Unless they’re compressed and resized to suitable formats, they can be over-large and slow page speed to a crawl.
The solution to that problem explains itself – compress and resize – but less obvious fixes can be found in using the appropriate file type. The .png format makes files smaller if they’re in block coloring and simple – like infographics, illustrations and icons.
But for photographs, with a wide number of colors and much finer details, .png can compromise the quality of the image. You might consider using .jpg files instead, or .WebP, an open source image type format from Google, which supports both lossy and lossless compression.
Using correctly sized, unscaled images manually can be quite a daunting task for web developers. PageSpeed modules from Google can come in handy, automating many of the tasks necessary for site speed optimization.
You can also minify the source codes. CSS and JavaScript resources could be minified using tools like http://ift.tt/1iVdwRT and http://cssminifier.com/ – and should save kilobytes otherwise spent on whitespace.
The HTML should be also as compact as possible. We recommend stripping out all the unnecessary whitespace and empty lines.
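To illustrate what minification actually does, here is a deliberately naive sketch (real minifiers such as the tools linked above are far more sophisticated):

```python
import re

def naive_css_minify(css):
    """Toy CSS minifier: strips comments and collapses whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # strip /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten around punctuation
    return css.strip()

sample = """
/* button styles */
.btn {
    color: #fff;
    padding: 4px 8px;
}
"""
print(naive_css_minify(sample))  # → .btn{color:#fff;padding:4px 8px;}
```

Every stripped space and comment is bytes your server never has to send.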
Time to go mobile
Not very responsive
Most retailers in the study had mobile-optimized sites, but 24% of them served their mobile users a separate mobile site – usually on a separate subdomain. While this approach improves UX, it can be inconvenient for two reasons:
1)    Google handles subdomains as separate domains.
2)    Depending on how the viewport-based redirects are set up, in the new, mobile-first index world, this can mean that the Googlebot (visiting with a smartphone user agent) will have trouble reaching the desktop version of the site.
A safer solution can be to use a responsive site that delivers the same HTML code to all devices, but adapts to the size and shape of the device used. We found that this had representation on only 76% of the sites.
Alarmingly, mobile sites themselves were largely poorly-optimized for mobile; the average mobile site scored 53.9/100 for speed, as opposed to the average desktop score of 59.4/100.
Hewlett Packard had a massive difference of 29 points between their desktop score (at 77/100) and their mobile (48/100), while the worst offenders were Carat London, who scored zero for both mobile and desktop score.
Here is the list of the top 10 websites based on Google’s Page Speed Insights:
URL                     | Desktop Score | Mobile Score | Total PageSpeed Score
http://ift.tt/zYfofk    | 97 | 95 | 192
http://www.ikea.com     | 95 | 88 | 183
http://ift.tt/WkkIWh    | 83 | 99 | 182
http://ift.tt/1gO9yDs   | 92 | 85 | 177
http://www.wynsors.com/ | 89 | 88 | 177
http://ift.tt/1lsbHeE   | 90 | 86 | 176
http://ift.tt/1gvzM45   | 80 | 95 | 175
http://ift.tt/Ao2JgX    | 88 | 86 | 174
http://ift.tt/107wloP   | 89 | 85 | 174
http://ift.tt/1g0CSt8   | 90 | 84 | 174
Mobile management
Much of the mobile optimization requires coding and/or web development skills, but worry not – Google have created a guide to delivering a mobile page in under a second.
AMP it up
AMP – Accelerated Mobile Pages – is Google’s initiative for producing more efficient webpages for mobile. It’s a work-in-progress, but every day brings new developments and more support, customization and stability.
AMP pages have a number of benefits for all sites, including being preferred by Google in search rankings, and being faster to load. For ecommerce it’s a technology to implement ASAP, or at least keep an eye on.
While AMP debuted for publishing sites, recent updates have brought ecommerce sites into the fold, and eBay in particular have been quick on the uptake, now serving over eight million pages through the AMP system. Google is also working with eBay on things like A/B testing and smart buttons.
“With items like these in place, AMP for ecommerce will soon start surfacing,” says Senthil Padmanabhan, the principal engineer of eBay.
Customization on ecommerce AMP pages is currently low, but it’s an ideal technology for the industry, allowing customers quicker transitions between products – improving conversion rates and making the website easy to use.
During testing on the websites in our study, AMP was found to have a 71% faster load speed for blog posts, and a reduced page size from 2.3MB to 632kB.
Onwards and upwards
Site speed isn’t a problem that’s going to go away. As time goes by, the technology improves – AMP and HTTP/2 are just the latest steps on the road to real-time loading. 5G is on the horizon, and customers are only going to become less patient with slow-loading pages.
As a result, it’s increasingly necessary to keep an eye on your site analytics and your customer behavior. A speed improvement of just one second can improve your conversion rate by 27% – and a delay of one second can cost you millions a year.
Make sure you’re on top of bringing your ecommerce business and site into the modern era with the tips I’ve listed here.
Site speed tactics in a mobile-first world: Why ecommerce brands need to step up their site speed game syndicated from http://ift.tt/2maPRjm
But SPDY isn’t the latest protocol developed by Google. Working closely with W3C (World Wide Web Consortium), they’ve developed the new HTTP/2 protocol. HTTP/2 has roughly the same characteristics as SPDY, but is also binary, and allows the server to ‘push’ information to the requester, with better HPACK compression.
Despite the clear advantages of the HTTP/2 protocol, only a few websites have made use of it. Our recent research discovered that only 7.87% of the top 700 ecommerce sites use the technology – compared to 11.3% of sites overall. Some examples of websites using HTTP/2 are https://www.vans.co.uk/, https://www.paperchase.co.uk/ or https://www.expedia.co.uk/.
According to Cloudflare.com, when they implemented HTTP/2, they saw customers’ average page load time nearly halved – from 9.07 seconds for HTTP/1.X falling to 4.27 seconds for HTTP/2. That’s a significant improvement in a key area of website efficiency.
However, HTTP/2 doesn’t solve everything, and in some cases the results can be disappointing. In our research, many websites achieved only very small speed gains in their loading times when served over HTTP/2 instead of HTTP/1.1.
Switching to HTTP/2 isn’t enough by itself – many websites fail to optimize for the change and lose out on the maximum speed gains.
Old-school techniques, such as domain sharding or sprites, can be counter-productive. And using huge CSS or JavaScript files where less than 10% of the rules and code is relevant to pages likely to be visited is a waste of both your user’s time and your server’s time.
Screenshot from Dareboost comparison analysis of Oliver Bonas’ loading performance
Even our own measurements showed that the average loading time per 1 MB for websites supporting HTTP/2 was 1.74s, compared to 1.44s for websites not supporting HTTP/2.
A nice example of a successful HTTP/2 optimisation is Paperchase, who saved a full second of time necessary to load their website, as is shown here:
Screenshot from Dareboost comparison analysis of Paperchase loading performance
How To Tackle Protocols – HTTP/2 and you
If you want to be at the forefront of network protocols – and at the top of the list of faster sites – get an HTTP/2 protocol in place.
While HTTP/2 only requires the server to support the new protocol (many now do, though Microsoft’s IIS has no plans yet), the browsers need a TLS connection. This means every connection over HTTP/2 will be safe and secure, adding an extra layer of security to the internet.
For more information on how you can get started with HTTP/2, have a look at the Kaizen in-depth article here.
It’s all about size
The smaller, the better
If you’re trying to get speed up, you need to keep size down. The less there is to move from the Internet to the user, the less time it takes.
As I mentioned earlier in this article, the ecommerce sites looked at in our study were bigger on average than other webpages out there – 30% bigger, at 2.98 MB, compared to a global standard of 2.3MB.
Format, compress, minify
One of the biggest issues on plus-sized websites is pictures. Unless they’re compressed and resized to suitable formats, they can be over-large and slow page speed to a crawl.
The solution to that problem explains itself – compress and resize – but less obvious fixes can be found in using the appropriate file type. The .png format makes files smaller if they’re in block coloring and simple – like infographics, illustrations and icons.
But for photographs, with a wide number of colors and much finer details, .png can compromise the quality of the image. You might consider using .jpg files instead, or .WebP, an open source image type format from Google, which supports both lossy and lossless compression.
Using correctly sized, unscaled images manually can be quite a daunting task for web developers. PageSpeed modules from Google can come in handy, automating many of the tasks necessary for site speed optimization.
You can also minify the source codes. CSS and JavaScript resources could be minified using tools like http://javascript-minifier.com/ and http://cssminifier.com/ – and should save kilobytes otherwise spent on whitespace.
The HTML should be also as compact as possible. We recommend stripping out all the unnecessary whitespace and empty lines.
Time to go mobile
Not very responsive
Most retailers in the study had mobile-optimized sites, but 24% of them served their mobile users a separate mobile site – usually on a separate sub domain. While this approach improves UX, it can be inconvenient for two reasons:
1)    Google handles subdomains as separate domains.
2)    Depending on how the redirects based on viewport are set up, in the new, mobile-first index world, this can mean that the Googlebot (visiting with smartphone user agent) will have troubles reaching the desktop version of the site.
A safer solution can be to use a responsive site that delivers the same HTML code to all devices, but adapts to the size and shape of the device used. We found that this had representation on only 76% of the sites.
Alarmingly, mobile sites themselves were largely poorly-optimized for mobile; the average mobile site scored 53.9/100 for speed, as opposed to the average desktop score of 59.4/100.
Hewlett Packard had a massive difference of 29 points between their desktop score (at 77/100) and their mobile (48/100), while the worst offenders were Carat London, who scored zero for both mobile and desktop score.
Here is the list of the top 10 websites based on Google’s Page Speed Insights:
URL Desktop Score Mobile Score Total PageSpeed Score http://www.tmlewin.co.uk/ 97 95 192 http://www.ikea.com 95 88 183 http://www.vodafone.co.uk 83 99 182 http://www.findmeagift.co.uk/ 92 85 177 http://www.wynsors.com/ 89 88 177 http://www.sofasworld.co.uk/ 90 86 176 http://www.americangolf.co.uk/ 80 95 175 http://www.mulberry.com/ 88 86 174 http://www.worldstores.co.uk/ 89 85 174 https://forbiddenplanet.com/ 90 84 174
Mobile management
Much of the mobile optimization requires coding and/or web development skills, but worry not – Google have created a guide to delivering a mobile page in under a second.
AMP it up
AMP – Accelerated Mobile Pages – is Google’s initiative for producing more efficient webpages for mobile. It’s a work-in-progress, but every day brings new developments and more support, customization and stability.
AMP pages have a number of benefits for all sites, including being preferred by Google in search rankings, and being faster to load. For ecommerce it’s a technology to implement ASAP, or at least keep an eye on.
While AMP debuted for publishing sites, recent updates have brought ecommerce sites into the fold, and eBay in particular have been quick on the uptake, now serving over eight million pages through the AMP system. Google is also working with eBay on things like A/B testing and smart buttons.
“With items like these in place, AMP for ecommerce will soon start surfacing,” says Senthil Padmanabhan, the principal engineer of eBay.
Customization on ecommerce AMP pages is currently low, but it’s an ideal technology for the industry, allowing customers quicker transitions between products – improving conversion rates and making the website easy to use.
During testing on the websites in our study, AMP was found to have a 71% faster load speed for blog posts, and a reduced page size from 2.3MB to 632kB.
Onwards and upwards
Site speed isn’t a problem that’s going to go away. As time goes by, the technology improves – AMP and HTTP/2 are just the latest steps on the road to real-time loading. 5G is on the horizon, and customers are only going to become less patient with slow-loading pages.
As a result, it’s increasingly necessary to keep an eye on your site analytics and your customer behavior. A speed improvement of just one second can improve your conversion rate by 27% – and a delay of one second can cost you millions a year.
Make sure you’re on top of bringing your ecommerce business and site into the modern era with the tips I’ve listed here.
from IM Tips And Tricks https://searchenginewatch.com/2017/04/04/site-speed-tactics-in-a-mobile-first-world-why-ecommerce-brands-need-to-step-up-their-site-speed-game/
0 notes
sheilalmartinia ¡ 8 years ago
Site speed tactics in a mobile-first world: Why ecommerce brands need to step up their site speed game
A study of 700 top ecommerce brands found that the majority are underperforming when it comes to optimizing their sites for speed. Find out how you can avoid the same mistakes.
Web users are not patient. The speed of your site can make a massive difference to whether people will visit your site, whether they’ll stay on it, and whether they will come back. Not to mention whether they’ll make a purchase.
A massive 79% of shoppers who have been disappointed by a site’s performance say that they’re less likely to use the site again. But what constitutes ‘disappointed’?
We’re only human after all
Kissmetrics research on customer reactions to site speed has resounded across the industry, but it’s not something that should be forgotten:
“If an e-commerce site is making $100,000 per day, a 1 second page delay could potentially cost you $2.5 million in lost sales every year.”
That’s a 7% reduction in your conversion rate, and 52% of customers say site speed is a big factor in their site loyalty. A one second delay is bad – a two second delay is worse. 47% of consumers expect web pages to load within two seconds.
But based on the same research, a faster full-site load leads to a 34% lower bounce rate, and an improvement by just one second results in a 27% conversion rate increase.
It’s because site speed is such a vital part of building a successful ecommerce site that my team and I conducted a study into 700 top UK ecommerce sites, analyzing various aspects of their site speed performance.
What we found is that the biggest brands have some of the poorest optimization, with outdated web protocols, unresponsive pages, and bloated page size.
The average web page size is now 2.3MB (that’s the size of the shareware version of the classic game Doom), so we wanted to see whether the ecommerce industry is any better – since their businesses are directly dependent on their website performance.
Surprisingly, we have found that the web page size of the top UK ecommerce sites is 30% larger on average than standard websites – at 2.98 MB.
Average webpage size according to HTTPArchive
However, the web page size isn’t the only factor impacting the site speed. Even larger sites load and render quickly if they’re smart about how they deliver.
My team and I picked the top 700 UK ecommerce sites, based on their estimated monthly traffic with data kindly supplied by SimilarWeb. For each, we analysed them using Google’s PageSpeed Insights API, checked their page size and loading time on Pingdom, and verified their HTTP protocol using https://http2.pro/.
From this, we found the following data, and used it to determine which websites are best optimised for speed:
PageSpeed Insights Desktop Score (not considering third party data)
PageSpeed Insights Mobile Score (not considering third party data)
HTTP/2
Web page size
Loading Time
Loading Time per MB
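The last metric, loading time per MB, simply normalises a page's load time by its size so that heavy and light pages can be compared on delivery efficiency rather than raw weight. A minimal sketch (the figures below are hypothetical, not the study's raw data):

```python
def load_time_per_mb(load_time_s: float, page_size_mb: float) -> float:
    """Normalise a page's load time by its size, so heavy and light
    pages can be compared on delivery efficiency rather than raw weight."""
    return load_time_s / page_size_mb

# Hypothetical example: a 2.98 MB page that loads in 5.2 seconds
print(round(load_time_per_mb(5.2, 2.98), 2))  # → 1.74
```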
Desktop vs mobile
Mobile connections are usually slower than desktop to begin with, so further delays are felt even more keenly. Together with the fact that Google’s latest mobile update now factors site speed into mobile page ranking, this makes properly optimized mobile pages a high-value priority.
This becomes even more of a consideration when you factor in how much of ecommerce traffic is now mobile – for example Vodafone, the third top-scoring website in our recent research, receives only 20% of their traffic from desktop, with 80% coming from mobile devices.
Make your site work for you
Your site speed isn’t simply a dial you can turn up in your page settings; there are a number of factors which contribute to it. Here’s what they are, and how you can start making your site one of the fastest out there.
Protocol power
HTTP/1.1 isn’t SPDY enough
Network protocols are the rules and standards that govern the end points in a telecommunication connection – how data is transmitted over the web. Common examples include IP – Internet Protocol – and HTTP – Hypertext Transfer Protocol.
The HTTP/1.1 protocol is decades old and doesn’t take full advantage of modern networks. Its main limitation is that it can’t multiplex: each connection handles only one request at a time, so files can’t be downloaded in parallel over a single connection.
Browsers work around this by opening up to six connections per domain, but that still caps how many files can be downloaded and rendered simultaneously – and that costs time.
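To see why the connection cap matters, here’s a toy model of how many sequential “rounds” of downloads a page needs under HTTP/1.1 (ignoring pipelining and connection-reuse timing):

```python
import math

def download_rounds(num_files: int, max_connections: int = 6) -> int:
    """With one request in flight per connection and at most
    `max_connections` connections per domain, a page's assets arrive
    in sequential 'rounds'. A deliberately simplified model."""
    return math.ceil(num_files / max_connections)

# A page with 30 assets needs 5 rounds over HTTP/1.1;
# over HTTP/2 a single multiplexed connection can carry them all.
print(download_rounds(30))  # → 5
```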
Since the time of HTTP/1.1, Google has developed a newer version of the protocol, SPDY (“Speedy”), which allows simultaneous connections to be opened, and means it can serve multiple parts of the website (JavaScript, HTML, images, etc.) in parallel.
But SPDY isn’t the latest word in protocols. Building on SPDY, the IETF’s HTTP working group – with Google closely involved – standardised the new HTTP/2 protocol. HTTP/2 has roughly the same characteristics as SPDY, but is fully binary, allows the server to ‘push’ resources to the client, and compresses headers more efficiently with HPACK.
Despite the clear advantages of the HTTP/2 protocol, only a few websites have made use of it. Our recent research discovered that only 7.87% of the top 700 ecommerce sites use the technology – compared to 11.3% of sites overall. Some examples of websites using HTTP/2 are https://www.vans.co.uk/, https://www.paperchase.co.uk/ or https://www.expedia.co.uk/.
According to Cloudflare.com, when they implemented HTTP/2, they saw customers’ average page load time nearly halved – from 9.07 seconds for HTTP/1.X falling to 4.27 seconds for HTTP/2. That’s a significant improvement in a key area of website efficiency.
However, HTTP/2 doesn’t solve everything, and in some cases the results can be disappointing. In our research, many websites achieved only very small speed gains in their loading times when served over HTTP/2 instead of HTTP/1.1.
Switching to HTTP/2 isn’t enough by itself – many websites fail to optimize for the change and lose out on the maximum speed gains.
Old-school techniques, such as domain sharding or sprites, can be counter-productive. And using huge CSS or JavaScript files where less than 10% of the rules and code is relevant to pages likely to be visited is a waste of both your user’s time and your server’s time.
Screenshot from Dareboost comparison analysis of Oliver Bonas’ loading performance
Even our own measurements showed that the average loading time per 1 MB for websites supporting HTTP/2 was 1.74s, compared to 1.44s for websites not supporting HTTP/2.
A nice example of a successful HTTP/2 optimisation is Paperchase, who saved a full second of time necessary to load their website, as is shown here:
Screenshot from Dareboost comparison analysis of Paperchase loading performance
How To Tackle Protocols – HTTP/2 and you
If you want to be at the forefront of network protocols – and at the top of the list of faster sites – get an HTTP/2 protocol in place.
While the HTTP/2 specification itself doesn’t require encryption, all major browsers only support HTTP/2 over TLS (and most modern servers, including recent versions of nginx, Apache, and IIS 10, now support the protocol). In practice this means every HTTP/2 connection is encrypted, adding an extra layer of security to the web.
For more information on how you can get started with HTTP/2, have a look at the Kaizen in-depth article here.
It’s all about size
The smaller, the better
If you’re trying to get speed up, you need to keep size down. The less there is to move from the Internet to the user, the less time it takes.
As I mentioned earlier in this article, the ecommerce sites looked at in our study were bigger on average than other webpages out there – 30% bigger, at 2.98 MB, compared to a global standard of 2.3MB.
Format, compress, minify
One of the biggest issues on plus-sized websites is images. Unless they’re compressed, resized, and saved in suitable formats, they can be far larger than necessary and slow page speed to a crawl.
The solution to that problem explains itself – compress and resize – but a less obvious fix is choosing the appropriate file type. The .png format keeps files small for simple images with blocks of flat color – infographics, illustrations and icons.
But for photographs, with a wide range of colors and much finer detail, .png produces much larger files for the same visual quality. Consider using .jpg instead, or WebP, an open-source image format from Google which supports both lossy and lossless compression.
Using correctly sized, unscaled images manually can be quite a daunting task for web developers. PageSpeed modules from Google can come in handy, automating many of the tasks necessary for site speed optimization.
You can also minify your source code. CSS and JavaScript can be minified with tools like http://javascript-minifier.com/ and http://cssminifier.com/ – saving kilobytes otherwise spent on whitespace.
The HTML should also be as compact as possible: strip out unnecessary whitespace and empty lines.
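As a rough sketch of what such tools do, the snippet below collapses whitespace in CSS or HTML. It’s for illustration only – real minifiers are syntax-aware and also handle comments, strings, and edge cases safely:

```python
import re

def naive_minify(source: str) -> str:
    """Crude whitespace minifier for CSS/HTML, for illustration only --
    production minifiers parse the syntax rather than pattern-match it."""
    collapsed = re.sub(r"\s+", " ", source)                   # collapse whitespace runs
    collapsed = re.sub(r"\s*([{};:>])\s*", r"\1", collapsed)  # tighten around punctuation
    return collapsed.strip()

css = """
body {
    margin: 0;
    color: #333;
}
"""
print(naive_minify(css))  # → body{margin:0;color:#333;}
```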
Time to go mobile
Not very responsive
Most retailers in the study had mobile-optimized sites, but 24% of them served mobile users a separate mobile site – usually on a separate subdomain. While this approach can improve the mobile experience, it has two drawbacks:
1) Google handles subdomains as separate domains.
2) Depending on how the viewport-based redirects are set up, in the new mobile-first index world the Googlebot (visiting with a smartphone user agent) can have trouble reaching the desktop version of the site.
A safer solution is a responsive site that delivers the same HTML code to all devices but adapts to the size and shape of the screen. We found that only 76% of the sites took this approach.
Alarmingly, mobile sites themselves were largely poorly-optimized for mobile; the average mobile site scored 53.9/100 for speed, as opposed to the average desktop score of 59.4/100.
Hewlett Packard had a massive difference of 29 points between their desktop score (77/100) and their mobile score (48/100), while the worst offender was Carat London, which scored zero on both mobile and desktop.
Here is the list of the top 10 websites based on Google’s Page Speed Insights:
URL                              Desktop Score  Mobile Score  Total PageSpeed Score
http://www.tmlewin.co.uk/              97             95             192
http://www.ikea.com                    95             88             183
http://www.vodafone.co.uk              83             99             182
http://www.findmeagift.co.uk/          92             85             177
http://www.wynsors.com/                89             88             177
http://www.sofasworld.co.uk/           90             86             176
http://www.americangolf.co.uk/         80             95             175
http://www.mulberry.com/               88             86             174
http://www.worldstores.co.uk/          89             85             174
https://forbiddenplanet.com/           90             84             174
Mobile management
Much of the mobile optimization requires coding and/or web development skills, but worry not – Google have created a guide to delivering a mobile page in under a second.
AMP it up
AMP – Accelerated Mobile Pages – is Google’s initiative for producing more efficient webpages for mobile. It’s a work-in-progress, but every day brings new developments and more support, customization and stability.
AMP pages have a number of benefits for all sites, including being preferred by Google in search rankings, and being faster to load. For ecommerce it’s a technology to implement ASAP, or at least keep an eye on.
While AMP debuted for publishing sites, recent updates have brought ecommerce sites into the fold, and eBay in particular have been quick on the uptake, now serving over eight million pages through the AMP system. Google is also working with eBay on things like A/B testing and smart buttons.
“With items like these in place, AMP for ecommerce will soon start surfacing,” says Senthil Padmanabhan, a principal engineer at eBay.
Customization on ecommerce AMP pages is currently low, but it’s an ideal technology for the industry, allowing customers quicker transitions between products – improving conversion rates and making the website easy to use.
During testing on the websites in our study, AMP was found to have a 71% faster load speed for blog posts, and a reduced page size from 2.3MB to 632kB.
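That reported size reduction is easy to put in percentage terms:

```python
# Page size before and after AMP, from the study's blog-post tests above.
before_kb = 2300   # 2.3 MB
after_kb = 632     # 632 kB
reduction = 1 - after_kb / before_kb
print(f"{reduction:.0%} smaller")  # → 73% smaller
```

So alongside the 71% faster load, AMP pages weighed roughly a quarter of their originals.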
Onwards and upwards
Site speed isn’t a problem that’s going to go away. As time goes by, the technology improves – AMP and HTTP/2 are just the latest steps on the road to real-time loading. 5G is on the horizon, and customers are only going to become less patient with slow-loading pages.
As a result, it’s increasingly necessary to keep an eye on your site analytics and your customer behavior. A speed improvement of just one second can improve your conversion rate by 27% – and a delay of one second can cost you millions a year.
Make sure you’re on top of bringing your ecommerce business and site into the modern era with the tips I’ve listed here.
from Search Engine Watch https://searchenginewatch.com/2017/04/04/site-speed-tactics-in-a-mobile-first-world-why-ecommerce-brands-need-to-step-up-their-site-speed-game/