The end of HTTP is drawing near
https://gtmlabs.com/the-end-of-http-is-drawing-near/
Fri, 20 Mar 2020

In 2020, with the end of HTTP drawing near, Google is taking aim at websites with mixed content.

Mixed content includes content downloads such as software executables, documents, and media files offered from secure HTTPS websites over insecure HTTP connections.

Mixed content proliferated as websites moved to HTTPS from mid-2018, when Google's Chrome browser started flagging sites that transmit information insecurely.

Users seeing the HTTPS padlock on a site in Chrome typically assume that any downloads it offers are also secure.

Google’s recent announcement points out:

Insecurely-downloaded files are a risk to users’ security and privacy.
For instance, insecurely downloaded programs can be swapped out for malware by attackers, and eavesdroppers can read users’ insecurely-downloaded bank statements.

To eliminate this issue, Google has recently announced a timetable for phasing out insecure file downloads in its Chrome browser.

It will be a gradual effort rather than an immediate hard cutoff. It will begin with warnings in desktop Chrome version 81, due out next month. By Chrome version 86, scheduled for release in October 2020, all downloads over HTTP will be blocked.

Mobile versions of Chrome will follow the same timetable, except that they will lag one version behind their desktop counterparts.

This latest plan underlines Google's desire to improve security and user experience by promoting HTTPS everywhere in Chrome.

Note that eradicating insecure downloads doesn't guarantee that a download isn't malicious. It merely means the file hasn't been tampered with as it travels from the web server to your computer.
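If you want a rough idea of whether your own pages link to downloads over plain HTTP, a scan of the HTML is a reasonable first pass. Below is a minimal sketch using only Python's standard library; the list of "download" extensions and the function names are our own illustrative choices, not an official Chrome definition.

```python
from html.parser import HTMLParser

# File extensions treated as "downloads" in this sketch -- an assumption
# for illustration, not Chrome's actual list.
DOWNLOAD_EXTENSIONS = (".exe", ".zip", ".pdf", ".doc", ".docx", ".mp3", ".mp4")

class InsecureDownloadFinder(HTMLParser):
    """Collects anchor hrefs that point to downloads over plain HTTP."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if (name == "href" and value
                    and value.lower().startswith("http://")
                    and value.lower().endswith(DOWNLOAD_EXTENSIONS)):
                self.insecure.append(value)

def find_insecure_downloads(html):
    """Return download links on the page that are served over http://."""
    parser = InsecureDownloadFinder()
    parser.feed(html)
    return parser.insecure
```

A real audit would also crawl linked pages and follow redirect targets; this only inspects a single page's markup.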

As part of our web security offering, we can help scan and fix these mixed content issues.

Shall we have a conversation? 

Search engine blacklist – are you on them?
https://gtmlabs.com/search-engine-blacklist/
Fri, 21 Feb 2020

For businesses and individuals who rely heavily on online visibility, nothing feels more alarming than the possibility of landing on a search engine blacklist.

Once a site is blacklisted, its presence on search results can be drastically reduced or even removed altogether, cutting off traffic and credibility in an instant. For entrepreneurs, marketers, and content creators, understanding how these blacklists operate is not just a technical concern but a fundamental safeguard for protecting brand reputation.

What makes the issue of a search engine blacklist particularly challenging is that it often happens quietly, without immediate notice to the site owner. Many discover the problem only after experiencing a sharp drop in visitors or receiving alerts from users who encounter warnings before accessing the site.

By learning what triggers these blacklists and how to avoid them, website owners can shield themselves from unnecessary setbacks and maintain the digital trust that drives growth.

Why do search engine blacklists exist?

For a search engine, a good user experience is paramount, and protecting users from harmful websites is part of delivering one.

Websites that have been blacklisted will display a strong warning such as "This site may be compromised", or show a red screen enveloping the entire website.

If your website is on a search engine's blacklist, you lose not only organic growth but, more importantly, reputation and goodwill. Your ranking will drop drastically, and the business consequences can be devastating: lost sales, broken trust, and a long road to recovery.

In extreme cases, you might even have to rebrand your business under a new domain name, rebuilding all your goodwill and organic traffic from scratch. That takes a lot of time and effort, and you might have to embark on a costly rebranding campaign to point your audience to the new website.

How do websites get on the search engine’s blacklist?

There are various reasons why a site is blacklisted.  

Blackhat SEO tactics. To rank higher on Search Engine Result Pages (SERPs), some SEO specialists adopt practices designed to dupe the search engine into ranking a page higher. Cloaking and excessive link exchanges are two such tactics, and you can read more about blackhat SEO tactics elsewhere.

Spammy website. A site can become spammy when malware injects spam content into it. This makes for unhappy visitors, and it doesn't go down well with Google once the site is discovered.

Website spreading malware. In some instances, your site could be used to deliver malware to unsuspecting visitors. Once the search engines pick this up, users will be shown a red screen warning them not to proceed. Your website will be quarantined and eventually blacklisted.

Website that plagiarizes. If a website steals content from other sites and passes it off as its own, the search engine's algorithms can quickly and easily pick it up. Be blatant about it, and you will soon get blacklisted too.

The sad truth is that you might not even know you have been banned by the search engines.

It has been reported that malware typically resides on a website for three to six months without the owner's knowledge. The problem starts to surface when you are notified by your web host or, worse still, by your customers and prospects.

It is not the search engine's role to remove malicious code; that falls to the website owner. If you have Google Search Console in place, you may receive a warning there.

Take a proactive approach

Instead of a reactive approach, which is challenging and detrimental, we recommend a proactive one: set up an alert mechanism, link your site to Google Search Console, and implement an anti-malware solution.

We believe that, as a site owner, you should be the "first to know" of anything happening on your website (including the presence of malware), ahead of the search engines or anyone else. Negative news, once made public, can be hard to manage and might spin out of control into damaging publicity.

Reputation is priceless. Why risk it?

One way to avoid it is to deploy an alert mechanism. Once an alert is triggered, you can start your own internal investigation and quickly rectify and contain the problem.
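One lightweight alert mechanism is to periodically check your own URLs against Google's Safe Browsing Lookup API (v4), which reports whether a URL appears on Google's threat lists. The Python sketch below only builds the JSON request body; actually querying the API requires your own API key and an HTTPS POST to the `threatMatches:find` endpoint, and the `client_id` value shown is a placeholder.

```python
# Endpoint of Google's Safe Browsing v4 Lookup API (requires ?key=YOUR_API_KEY).
SAFE_BROWSING_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_lookup_payload(urls, client_id="my-site-monitor"):
    """Build the JSON body for a Safe Browsing v4 threatMatches:find request.

    Field names follow Google's v4 Lookup API; client_id is a placeholder
    you would replace with your own application name.
    """
    return {
        "client": {"clientId": client_id, "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }
```

An empty response from the API means the URLs are not currently on Google's lists; any returned matches are your cue to start an internal investigation immediately.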

Next, have a good anti-malware solution to nip the problem in the bud. With anti-malware in place on your site, the malware can be detected and tackled before it does its damage.

We can help you get ready and avoid trouble with the search engines.

If, however, you are already on a search engine's blacklist, head to Google Search Console to see what action is needed, and remove any malware present. Next, check all login credentials and remove infected files manually, or restore from an uninfected backup after confirming it contains no malware.

Once you have taken these preliminary steps to make your website "good" again, you can initiate a removal request through Google Search Console. The review can take days.

Let us know if you need any assistance in the above area.

Mixed content and SSL
https://gtmlabs.com/mixed-content-and-ssl/
Mon, 20 Aug 2018

Mixed content is a security issue, and one that a Content Security Policy (CSP) can help address.

For the longest time since the birth of the Internet, websites were not required to use the secure HTTPS transport protocol to serve content to browsers like Google Chrome. Sites without HTTPS transmitted traffic and data requests in the clear.

However, with the rise of cybercriminal activity, increasing numbers of financial transactions, and concerns around personal data, the need for secure transmission became critical.

With effect from July 2018, Google's web browser, Chrome, started flagging websites that are not served over HTTPS. Google requires data and traffic to be encrypted in transit between the browser and the web server and vice versa. In this manner, neither the website nor its users are exposed to attack.

Website owners who handle transactions online started implementing SSL certificates to give their site visitors peace of mind that they are conducting their purchases on a secure site.

With privacy laws quickly being implemented by many countries, companies and organizations soon followed suit, enforcing the HTTPS protocol to safeguard private data provided by their customers over the web.

Mixed content occurs on websites that were designed and uploaded under an HTTP URL and later converted to HTTPS by implementing an SSL certificate.

Mixed content is a security loophole. It exposes your web traffic during transmission.

Despite the HTTPS web link, some content on the website, such as videos, images, and scripts, is still transmitted over the insecure HTTP connection.

Hence you have a mixed-content issue: HTTPS and HTTP resources loading on the same page.

Any data transmitted over insecure HTTP exposes the website to man-in-the-middle attacks. By intercepting these unsecured transmissions, cybercriminals can gain access to data such as login credentials and credit card details.

This mixed content issue must be quickly fixed to ensure ALL content is transmitted through the secure HTTPS protocol before a data breach occurs.
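As a first pass at finding mixed content yourself, you can scan a page's HTML for sub-resources still referenced over `http://`. This Python sketch uses only the standard library; the tag/attribute list is a simplified assumption for illustration and does not cover every way a browser can load a sub-resource (CSS `url()` references, for example).

```python
from html.parser import HTMLParser

# Tag/attribute pairs that commonly pull in sub-resources. A simplified
# list for illustration, not the browser's full mixed-content definition.
RESOURCE_ATTRS = {"img": "src", "script": "src", "link": "href",
                  "iframe": "src", "video": "src", "audio": "src"}

class MixedContentScanner(HTMLParser):
    """Collects (tag, url) pairs for sub-resources loaded over plain HTTP."""
    def __init__(self):
        super().__init__()
        self.mixed = []

    def handle_starttag(self, tag, attrs):
        wanted = RESOURCE_ATTRS.get(tag)
        if not wanted:
            return
        for name, value in attrs:
            if name == wanted and value and value.lower().startswith("http://"):
                self.mixed.append((tag, value))

def scan_for_mixed_content(html):
    """Return sub-resources on an HTTPS page still referenced via http://."""
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.mixed
```

Browser developer-tools consoles also report mixed content per page, which is handy for spot checks; a scanner like this is more useful for sweeping many pages at once.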

As a site owner, you want to fix this before it is too late.

As part of our web security offering, we can scan and fix these mixed content issues.

Shall we have a conversation?

Website download – a key search engine ranking factor
https://gtmlabs.com/website-download-a-key-search-engine-ranking-factor/
Fri, 19 Jan 2018

A website's loading speed is not only important to its visitors; it is also a key ranking factor for search engines.

Google has been focusing on speed as a ranking factor for the last eight years.

In January 2018, Google formally announced that it would roll out its Speed Update algorithm in mid-2018. This will be a significant update, and the early announcement gives website owners a heads-up to start taking action to improve their site loading speed.

Why is Google doing this?

Google is making this Speed Update because speed matters to users, especially as more and more of them surf on mobile devices, which typically have slower connections than desktops.

Users' patience for slow-loading sites is waning. If a website takes too long to load, they simply move on to another. Users expect to move through a brand's site and pages at lightning speed, and a web page that loads quickly increases user satisfaction.

A key ranking factor

One of Google search’s key ranking factors is a site’s download speed.

So what does this mean to you, a website owner?

If your site is not loading fast enough, visitors will abandon it and move on to another. In time, your ranking on Google's search engine results pages will drop.

If your competitors' web pages load faster than yours, your site will rank lower than theirs. As you can see, speed matters.

What contributes to a website download speed?

Many factors determine the performance of a website. They generally fall into two areas: front-end and back-end.

Front-end refers to the web pages and all their associated resources, as rendered by a browser. Front-end components that have an impact on loading time include:

  • HTML code
  • CSS files
  • JavaScript
  • media sizes (images, video, etc.)
  • the use of caching
  • the use of redirects

The back-end refers to the web server delivering the page to your browser. Back-end factors include:

  • the location of the web server,
  • connection speed,
  • PHP version,
  • the number of HTTP requests, etc.

Optimizing both front-end and back-end is critical in achieving optimal website performance and ranking higher on search engine result pages.
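As a quick, rough check of back-end responsiveness, you can time how long the HTML document itself takes to fetch. This is a minimal Python sketch with an injectable `fetch` function (our own convention, so it can be tested offline); it ignores front-end resources like images and scripts, so treat it as an indicator rather than a full audit.

```python
import time
import urllib.request

def measure_load_time(url, fetch=None):
    """Time a single page fetch, returning (seconds, body size in bytes).

    `fetch` is injectable so the function can be exercised without the
    network; by default it performs a real HTTP GET. Only the HTML
    document is measured, not the sub-resources it references.
    """
    if fetch is None:
        fetch = lambda u: urllib.request.urlopen(u, timeout=10).read()
    start = time.perf_counter()
    body = fetch(url)
    elapsed = time.perf_counter() - start
    return elapsed, len(body)
```

Full-page audits (including sub-resources and render time) are better done with browser-based tools, but a timer like this is enough to catch a sudden back-end regression after a deployment.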

Let’s have a conversation if you would like to do a web page speed audit or improve your website speed.

3 important things you need to know when working with search engines
https://gtmlabs.com/3-important-things-you-need-to-know-when-working-with-search-engines/
Fri, 08 Apr 2016

To increase traffic to your website, you must first understand what drives search engines and how they go about achieving those goals.

The primary mission of a search engine is to create a good user experience for those who use its search capabilities. To achieve a satisfying experience, a search engine has to consider many factors.

However, these factors can be broken down into three broad categories.

Relevant content

Long before Google, there were already many other search engines on the Internet. AltaVista, Ask Jeeves, Excite, and others were household names back then, much as Google is today. However, while these companies understood the importance of search engines on the Internet, they did not understand the power of quality, relevant content. In those days, searches on these platforms often returned disappointing results.

Hence, when a search engine that provided more relevant results (i.e., Google) came along, users migrated to the new platform. Users want quality, relevant results, and being able to provide them is vital. If you do it right, your visitors will come.

Quality results, in turn, depend on the right keywords and relevant content. With good content, visitors will spend more time on the page and may even share it. These actions signal to search engines that the page is relevant to current and potential users.

Download speed

With more and more users surfing the Internet on mobile devices while on the go, download speed becomes increasingly important.

Download speed over mobile devices is typically slower than over devices physically connected to the Internet (i.e., via LAN). On an already slow connection, any web page that adds to the download time will irk the user. This constitutes a poor experience in the eyes of search engines, and the website will be penalized by being ranked lower than a comparatively faster one. Hence, optimizing a website for faster download is critical.

It is imperative to check your website's download speed from time to time, especially after adding something new. A newly added image or piece of functionality may hurt your site's download performance.

Secure website

As cybercriminal activities intensify, search engines are increasingly putting more emphasis on website security. A new, more secure transmission protocol emerged because of this concern – HTTPS. With HTTPS in place, information that is transmitted between the web server and the client desktop or device is made more secure through encryption. Sites that run the HTTPS protocol have their domain name preceded by HTTPS in their URL.

In 2018, Google took it a step further, displaying a little padlock beside the URL of secure websites and warning its users about unsecured web pages they are about to visit.

While doing all three of the above doesn't guarantee you will appear on the first page of any search engine results page, it does provide assurance that you are in line with the search engines' mandate.

With over 100 ranking factors to optimize, getting the above right will serve as a good foundation for your efforts to rank higher within the search engines.
