Negotiating Load, Functionality and Efficiency in Website Design

Old and clunky websites can be a frustrating experience for both users and developers, and understandably so, as site "usability" grows ever more complicated. With programming languages evolving, tech companies updating devices and browsers, and user attention spans getting shorter and more tenuous, getting in front of change can be difficult as the urgency to search less and have more features instantly available at one's fingertips grows. This reality will continually shape whether programs (or domains) have the legs to keep us coming back in perpetuity.

The topic of best design practices must be negotiated and re-negotiated periodically for the sake of longevity, and much of it doesn't have to strain one's budget. Quantifying and understanding how a site design is perceived, by user or bot, doesn't require an exhaustive A/B test or a complete code overhaul. Standard techniques like image optimization, W3C compliance, and file inlining, combining or minifying can enhance the whole design and improve rank performance. Here is what needs to be considered.

Image Optimizations

There are many techniques for manipulating and dialing down imagery so that it is responsive, crisp and lightweight. A fast load means a better experience, which translates into a number of other advantages. Not bothering to make your images web-friendly can leave pages looking microscopic on mobile, rendering slowly on spotty connections and expending more bandwidth than they should.

Compressing Images for Web

Reducing the number of bytes an image is encoded with has a measurable impact on how quickly pages load. Finding the "right" compression level takes a degree of finessing for each image: compressing too much can make images look pixelated and unclear, while not compressing enough defeats the purpose. There is an abundance of online image compression services, such as Fotoflexer, Optimizilla and Kraken.io, offering free, bulk and premium options to get images to the right dimensions and byte size for the web. Grab a large image file and test out resizing and compressing it to see at what point quality starts to suffer.

Providing Image Dimensions

Leaving out image dimensions, or providing dimensions that don't match the image file, means the browser must reflow the layout around the image and repaint it into the page once the file finishes downloading. Ensure featured images in the HTML and background images called in the CSS include a height and width. Providing alternate CSS versions for mobile layouts also helps.
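As a rough sketch (the file paths and the hero-banner class are purely illustrative), that means pairing explicit dimensions with both the markup and the stylesheet:

<!-- Explicit dimensions let the browser reserve space before the file arrives -->
<img src="/images/hero-1200x600.jpg" alt="Featured product" width="1200" height="600">

<style>
  /* Background image container with fixed dimensions, so no reflow on download */
  .hero-banner {
    width: 1200px;
    height: 600px;
    background-image: url("/images/hero-1200x600.jpg");
    background-size: cover;
  }

  /* Alternate dimensions for mobile layouts */
  @media (max-width: 480px) {
    .hero-banner {
      width: 100%;
      height: 300px;
      background-image: url("/images/hero-480x300.jpg");
    }
  }
</style>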

Caching Images

Caching resources, especially large image files, tells browsers how long to store and serve the same file before downloading it again from the server. This technique helps decrease load times for visitors who leave the site and return at a later time.
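How the lifetime gets declared depends on your server. As one hedged example, on an Apache server with mod_expires enabled, an .htaccess rule along these lines tells browsers to keep and reuse downloaded images (the one-year lifetime is only a placeholder value):

# .htaccess - let browsers reuse images for up to a year before re-requesting them
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/gif  "access plus 1 year"
</IfModule>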

CDNs

A content delivery network (CDN) is a collection of servers that distributes your content across multiple locations, so that no matter where a user is browsing from, a regionally based server can deliver the necessary files, speeding up overall rendering performance and protecting the site from lag during traffic spikes. The cost of a CDN might appear prohibitive for small and newer startup players, but there are actually plenty of CDNs that price your media by the amount of data transferred during the month rather than a flat fee. Microsoft Azure, Akamai Technologies and Rackspace are popular CDN providers.
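Once a CDN is in place, referencing heavy assets from its edge hostname is often the only markup change needed. In this sketch, cdn.example.com simply stands in for whatever hostname your provider assigns:

<!-- Heavy assets served from the CDN edge hostname instead of the origin server -->
<img src="https://cdn.example.com/images/hero-1200x600.jpg" alt="Featured product" width="1200" height="600">
<link rel="stylesheet" href="https://cdn.example.com/css/styles.min.css">
<script src="https://cdn.example.com/js/bundle.min.js"></script>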

File Optimizations

The three most common file types you will have to negotiate in developing a leaner website are your JavaScript (JS) files, HTML files and stylesheets (CSS). Each can be consolidated by minifying, combining or inlining with other files.

Minifying

Minifying files is the practice of reducing file sizes by omitting unnecessary bytes that come from extra spaces, line breaks, indentation, comments, linefeeds and other nonessential characters. Software such as YUI Compressor and JSMIN exists to support minification.
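To illustrate, here is the same arbitrary example rule before and after minification; the content is identical, only the nonessential bytes are gone:

/* Before minification: readable, with comments, indentation and line breaks */
.hero-banner {
    margin: 0 auto;      /* center the banner */
    padding: 20px 10px;
    color: #ffffff;
}

/* After minification: the identical rule with whitespace and comments stripped */
.hero-banner{margin:0 auto;padding:20px 10px;color:#fff}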

Inlining & Combining

Small CSS and JS files tend to reduce latency and load faster when their code and modules are inserted directly into the HTML. Inlining larger styles and scripts of 1,000+ characters, however, would cause greater delays than not inlining. Those larger files can often cut down on delay when portioned out and combined into a single file rather than deployed as several separate outputs.
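A rough sketch of the idea, with purely illustrative file names: the small critical styles are inlined in the page itself, while the larger scripts are portioned and combined into one minified file:

<head>
  <!-- Small critical styles inlined directly: no extra request to block rendering -->
  <style>
    body{margin:0;font-family:sans-serif}
    .hero-banner{width:100%;height:300px;background:#1e90ff}
  </style>
</head>
<body>
  <!-- Larger scripts combined into a single file: one request in place of
       separate menu.js, slider.js and tracking.js -->
  <script src="/js/bundle.min.js"></script>
</body>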

W3C

The W3C is the World Wide Web Consortium, a community that develops standards for coding the web. When you "validate" a website as W3C-compliant, you are aligning its code with the design practices set forth by that consortium. While these practices are not imperatives for all design or for search engine rankings, they can expose how and where your hypertext markup breaks down or deviates from the best formatting for communicating with browsers and servers.

How do I check for W3C compliance?

Go to https://validator.w3.org/ for HTML/XHTML and https://validator.w3.org/nu/ for HTML5. There you can scan your markup to see which warnings, errors, fatal errors, missing elements and bad values make it difficult for browsers to quickly parse and render your pages. Common examples of errors are unclosed tags, stray tags, undefined attributes, unsupported attributes and bad values.
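For example, the first snippet below would trip several of those validator errors, while the second is the corrected equivalent (the image file and values are arbitrary):

<!-- Invalid: an unclosed <b> tag, a stray </div>, a misspelled attribute and a bad value -->
<p>Our <b>best-selling widget</p>
</div>
<img src="widget.jpg" alt="Widget" hieght="200" width="two hundred">

<!-- Valid: tags closed and nested properly, attributes spelled and valued correctly -->
<p>Our <b>best-selling</b> widget</p>
<img src="widget.jpg" alt="Widget" height="200" width="300">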

Does W3C compliance play any part in determining rankings?

No…but yes, in a way. The truth is that while Google would never use validation errors themselves to rank websites, it does look at how usable and accessible your content is in terms of tags, semantic markup, loading speed and response errors, all of which are connected to appropriate formatting. So in cases where sites are seen as relatively equal in information and reputation, the site with optimal W3C compliance will typically have leaner, more bug-free code to spider and rank than non-compliant counterparts with errors, confused formatting and missing attributes.

What resources are available to help with optimizing and working through clunky, broken or non-compliant code?

W3Schools is a popular place to brush up on styling techniques and the browsers that support them, and it hosts a forum where questions are often posed to an engaged community of web designers and developers. Stack Overflow (http://stackoverflow.com/) can also put you in touch with a network of more than 4 million members who answer questions and post jobs related to web programming.

If the ideal workaround to your design-related dilemma still eludes you, whether from a performance perspective or for browser compatibility, there is still hope in a W3C task force called Houdini. Houdini is a group developing resolutions and features for the extensibility of CSS. Philip Walton, writing in Smashing Magazine, nicely diagrammed the efforts, both current and upcoming, to introduce features at each stage of the website rendering pipeline.

[Chart: Houdini specifications targeting each stage of the browser rendering pipeline]

(Source: Walton, Philip. "Houdini: Maybe The Most Exciting Development In CSS You've Never Heard Of." SmashingMagazine.com, 24 March 2016. Accessed 20 May 2016.)

From these new specifications, developers will gain access to other parts of the rendering pipeline to load scripts, create new constructs, provide new layout modules and include other custom properties. That way, it won't be so taxing or problematic to rely on polyfills and other workarounds when browsers ignore or discard certain CSS rules and declarations. These newer capabilities are sure to breathe new life into many designers' pages. Here is an example of a banner transition done through a Houdini worklet.
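A minimal sketch of how such a worklet could be wired up, assuming the CSS Paint API (registerPaint) and the Properties and Values API (@property); the banner-worklet.js file name and the --fade property are illustrative:

<!-- index.html -->
<style>
  /* Register the custom property so the browser knows how to transition it */
  @property --fade {
    syntax: "<number>";
    inherits: false;
    initial-value: 0;
  }
  .banner {
    height: 120px;
    --fade: 0;
    background-image: paint(bannerFade);
    transition: --fade 1s ease;
  }
  .banner:hover { --fade: 1; }
</style>

<div class="banner">Spring Sale</div>

<script>
  // Load the paint worklet where the browser supports the Paint API
  if ("paintWorklet" in CSS) {
    CSS.paintWorklet.addModule("banner-worklet.js");
  }
</script>

<!-- banner-worklet.js (a separate, hypothetical file) -->
<!--
registerPaint("bannerFade", class {
  static get inputProperties() { return ["--fade"]; }
  paint(ctx, size, props) {
    // Redraw the banner background as the --fade property transitions from 0 to 1
    const fade = parseFloat(props.get("--fade")) || 0;
    ctx.fillStyle = "rgba(30, 144, 255, " + fade + ")";
    ctx.fillRect(0, 0, size.width, size.height);
  }
});
-->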

To follow or participate in Houdini's process, GitHub currently hosts the group's discussion forums and drafts. Follow along to see which features are going to make web browsing a much faster, more customized experience.
