Without manual tuning, it is not possible to get a faster WordPress with HTTPS on a Cloud Server and earn a better ranking with good content. We underwent a breaking change of our architecture which practically completes over 1.5 years of work with schema.org markup, HTML5 and, obviously, a refreshed look. With a plain white-looking website it is probably possible to get visitors from the search engines, but it is very difficult to retain them. Since 2010 we have paused publication for more than 72 hours only three times – first when we were hacked (we were not on Rackspace at that time), which was a 7-day break; second, for 4 days when we shifted from Rackspace Cloud Sites to Rackspace Cloud Server (managed); and finally (at least for now), this time for 3 days.

Why did we spend those 3 days? More importantly, why are we stressing the words faster, HTTPS, Cloud Server and content? Content is king, but unfortunately there are many ways to generate automated content fully optimized for Google's crawlers. With the increasing penetration of cloud automation and data centers controlled by machine learning, it is expected that Google will correct false positives more frequently to catch spam. Spam obviously hurts Google's business – if we are wrongly penalized, we cannot invest in Google AdWords. One likely reason to introduce automation is to avoid employee-related issues. Moreover, Google and Apple have their differences with Facebook and Twitter, and all of those names became associated with PRISM. It is quite difficult to rebrand Apple, but Google, with its diverse products seeking a stable brand, wants to get rid of the governmental and political affairs which basically made them look unreliable about privacy. These are some of the assumed business and technical reasons behind the newer policies.
Faster WordPress With HTTPS on Cloud Server – Why Cloud Server?
Dedicated servers and bare metal cost a huge amount; unless the business is serious, it is not really possible to carry the cost of a true high-performance dedicated server. There are web hosts which deliver “dedicated servers” – we discussed the technical part before – and their lower cost can be due to two reasons: (1) the network is NOT dedicated, or (2) it is actually a Ponzi scheme – mostly they are virtual dedicated servers. Both dedicated and bare metal require huge manpower at the data center, and one physical machine can easily catch fire under unexpected load or a buggy program.
Cloud servers are an easy and cost-effective solution on a pay-as-you-go model. However, for higher privacy and secrecy (which financial websites need, for example), public cloud is never recommended: it is abstracted, but it is multi-tenant.
---
Depending on budget, Rackspace, Amazon and HP are good choices for the server. Rackspace, in general, is the best solution for most good websites; additionally, they have huge free resources. Rackspace is standalone – one need not get another provider’s service for HTTP acceleration. Those who are on Rackspace Cloud Sites (we have some readers who loved our guides for Rackspace Cloud Sites) must initiate a move towards Cloud Server.
DigitalOcean is not a bad option. However, there is no technical support, and the cost is not exactly low for a scalable infrastructure. Fujitsu and Dell are for full enterprise setups where Rackspace might not provide support for the business part.
Faster WordPress With HTTPS on Cloud Server – How To Make it Faster?
It is better to follow Google’s advice for page speed optimization, minus the PageSpeed Module for Nginx. You will not require the module for an Nginx + PHP5-FPM + LuaJIT + MySQL with InnoDB engine + minimum two-server configuration for WordPress. Basically, blogs and most business websites now use Plus, Like, Share etc. buttons which are delivered from others’ CDNs. Although compression and CORS rules can be set for your own assets, third-party JavaScripts will not follow your instructions. Even Google’s own webpage on page speed optimization has this problem with JavaScript. Forcing them without knowing what can happen (or rather, without A/B testing) is risky – it is better to avoid the PageSpeed Module for Nginx. The PageSpeed Module for Nginx is for more complex setups where no alternative can be found. This is exactly what we are pointing towards:
https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fthecustomizewindows.com%2F2014%2F09%2Fdeliver-feeds-feedburner-ssl-enabled-website-via-cloud-files%2F&tab=desktop
So, 7 rules have been passed with an 87/100 score – and not with a plain white, bad-looking webpage either:
- Avoid landing page redirects
- Eliminate render-blocking JavaScript and CSS in above-the-fold content
- Enable compression (a minimal Nginx gzip sketch follows this list)
- Minify CSS
- Minify HTML
- Prioritize visible content
- Reduce server response time
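For the “Enable compression” rule above, here is a minimal sketch of how gzip is usually switched on in Nginx. The values are illustrative defaults, not our exact production configuration – tune the compression level and MIME types for your own content:

```nginx
# Goes in the http {} context of nginx.conf – illustrative values only.
gzip              on;
gzip_comp_level   5;     # 1-9; higher costs more CPU for little extra gain
gzip_min_length   256;   # skip tiny responses where gzip overhead is not worth it
gzip_vary         on;    # send "Vary: Accept-Encoding" for caches and CDNs
gzip_proxied      any;
gzip_types        text/plain text/css application/javascript
                  application/json application/xml image/svg+xml;
```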
96/100 is the mobile user experience score. It is quite obvious that this guide is not for the noobs – like an SEO expert who academically was a backbencher in class. Obviously, a bigger business will require human resources for web development in the next decade, but mostly they will be front-end designers and server-side programmers. It is practically the story that was the rule 16 years back, before people discovered the flaws in Google’s algorithm – yes, CGI scripting. Backlinks will still be required, but it is unlikely that the Paul-and-Angela method will work anymore. Google, this time, approached the matter in a dramatic way, with multiple strategies calculated beforehand.
Keep one thing in mind – Barry Schwartz might not know about technical things, but his website has good content. Quite obviously, his own writings will rank well. But his methods – or anyone’s methods – of fooling the algorithm will not work. Coming back to the scoring systems, again see this test:
http://www.webpagetest.org/result/140922_T2_WQG/
95/100 on WebPagetest. Pingdom tests are not very reliable; they usually show a falsely lower loading time, and Pingdom is used to promote page speed related products.
All grades are A except “Cache static content” – which Google’s tool basically pointed out as well. Various browsers render the pages differently; the methodology we used is Google’s. On Opera, a user might need to wait a few milliseconds with a CSS-less HTML page. But that is better than having the ads at the top load only after the user has scrolled down to the bottom – the user gets an odd “what did I just see?” feeling, and after that will likely check another page instead. If things go fine with Mac Chrome, Mac Firefox and Mac Safari, it is unlikely to create real trouble. Mac Opera is for testing the maximum possible flaws. Mac, because Mac has Retina – that is the maximum possible resolution a user can have. Internet Explorer is a peculiar browser; beyond following the basics, there is not much to think about – essentially, a Windows XP user with IE 6 will face an SSL certificate related error if we keep the SSL certificate score at A+:
https://www.ssllabs.com/ssltest/analyze.html?d=thecustomizewindows.com
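To give a rough idea of the kind of Nginx HTTPS server block behind such a grade – and why IE 6 on Windows XP, which by default only speaks SSLv3, gets locked out – here is a hedged sketch. The domain, certificate paths and session values are placeholders, not our actual configuration, and the protocol and cipher choices should always be checked against current recommendations:

```nginx
# Hypothetical HTTPS server block – placeholders, not our live setup.
server {
    listen 443 ssl;
    server_name example.com;                              # placeholder domain

    ssl_certificate     /etc/nginx/ssl/example.com.crt;   # placeholder paths
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    # Dropping SSLv2/SSLv3 keeps the SSL Labs grade up; IE 6 on XP cannot connect.
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
    ssl_prefer_server_ciphers on;

    # Session resumption cuts the handshake cost on repeat visits.
    ssl_session_cache   shared:SSL:10m;
    ssl_session_timeout 10m;
}
```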
Faster WordPress With HTTPS on Cloud Server – Let Us Have a Checklist
So, let us make a list apart from what Google suggested:
http://googlewebmastercentral.blogspot.in/2014/08/https-as-ranking-signal.html
- Rackspace Cloud Server (managed) with a minimum two-server configuration – an application server and a web server; by increasing RAM and optimizing MySQL, it is possible to decrease the latency
- Rackspace Cloud Files or Akamai as the CDN. Akamai powers all the giants – from IBM to Microsoft.
- A good SSL certificate – either from GeoTrust, Symantec or Thawte. Thawte and GeoTrust have excellent practical support for re-issue related matters.
- Nginx compiled with LuaJIT, PHP5-FPM and their tweaks. All of this has been discussed before; more will be discussed in the future.
- MySQL with the InnoDB engine, or maybe PostgreSQL. PostgreSQL actually gives great performance, but WordPress does not support it natively.
- Proper security setup of the secured port for HTTPS delivery. In the future, delivery via custom ports will be possible; right now, except with a few providers, it is quite difficult to set up HTTPS on a custom port.
- W3 Total Cache
- Some way, or a plugin, to use fragment caching.
- Configure PHP5-FPM for caching properly. Varnish is not required if PHP5-FPM and Nginx are properly set up (see the FastCGI cache sketch after this list).
- XCache for Object Caching
- Inline all CSS. Google Plus essentially inlines its CSS. This contradicts Yahoo’s age-old rule.
- Avoid inserting WordPress conditional PHP functions via theme or plugin. It is better to use the functions.php file for the conditional stuff; it helps W3TC to cache properly.
- So, we are caching via W3TC, XCache, PHP5-FPM and WordPress fragment caching, and optimizing delivery with LuaJIT. Whatever is outsourced either has its headers set, as for Rackspace Cloud Files / Akamai as CDN, or is handled via CORS. JavaScript will escape this because it is essentially intended for real-time happenings – like the JavaScripts of ads, analytics etc.
- As a rule of thumb, fewer images are better – so use either sprites or Base64 encoding. Base64 encoding is our choice over sprites. But some things are better kept as images – like the website’s logo.
- Set the X-Origin at the CDN to your domain. Rackspace sets mycloud.rackspace.com as the origin; as Cloud Files costs less, there can be abusers, so set it to your own domain as part of CORS.
- Get rid of forms on webpages hosted on the same application server. This is a very big security loophole when we are using so many types of caching.
- Google has not yet said anything about TLS compression. TLS compression can be a point of vulnerability.
- Sell your website if you cannot maintain the cost. The Internet cannot be a place to make others’ private stuff public. Google is still soft about the rules, but expect harder words in the next 2 years.
- Blogging, per se, will not be killed. Blogging can be started on free hosts like a free PaaS, or on traditional third-party managed hosts like Blogger, WordPress.com etc. That is exactly how most successful bloggers started.
- Small businesses can use an eBay shop or Google’s various SaaS services.
- Basically, it is kind of going back 16 years to have a sane, safe Internet. Not all users are webmasters or have good knowledge about privacy, but essentially they use mobile devices which can breach the fine line of privacy.
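To illustrate the “Varnish is not required” point in the checklist above, here is a minimal sketch of Nginx’s built-in FastCGI caching in front of PHP5-FPM. The cache path, zone name, socket path and timings are assumptions for illustration; a real WordPress setup needs more bypass conditions (POST requests, query strings, cart and comment cookies) than shown here:

```nginx
# Illustrative sketch only. fastcgi_cache_path / fastcgi_cache_key live in the
# http {} context; the server block is a stripped-down WordPress front end.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=WPCACHE:64m
                   inactive=60m max_size=512m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";

server {
    listen 80;
    server_name example.com;                      # placeholder domain
    root  /var/www/html;                          # assumed WordPress root
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$args;    # standard WordPress permalinks
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php5-fpm.sock; # assumed PHP5-FPM socket

        # Serve cached pages to anonymous visitors, never to logged-in users.
        fastcgi_cache        WPCACHE;
        fastcgi_cache_valid  200 301 302 10m;
        fastcgi_cache_bypass $cookie_wordpress_logged_in;
        fastcgi_no_cache     $cookie_wordpress_logged_in;
    }
}
```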
When you are using SSL, you have some insurance. This is a good point for sites which need transactions. Keep one thing in mind – if Google introduces a machine learning based algorithm and you do something wrong, knowingly or unknowingly, it will spot the pattern and simply ignore you – specific to your domain, maybe to you personally.