website security – gtmlabs.com

6 things companies can do during Covid19

The Covid19 outbreak is a black swan event – an event that no one expected.

As it spreads across the world, governments are implementing measures to “flatten the curve”, and companies have been instructed to allow employees to work from home.

During this time, face-to-face meetings are also kept to a minimum to starve the virus of new hosts. Remote meetings via webinars are now the preferred mode of communication.

While we do our part to “flatten the curve”, we must not forget about the cybercriminals. They are still around despite the broader threat of Covid19.  

Companies can do a few things to improve their position: modernize their tools, enhance the way they work, better protect themselves, and get ready for an uptick in business when the virus passes.

#1 Improve connectivity security

With employees working remotely, companies would want to improve their cyber defenses. Companies should consider deploying a firewall and VPN solution if they have not already done so. A VPN encrypts traffic between the remote employee and the corporate network, which helps prevent “man in the middle” attacks.

#2 Improve endpoint security

Companies should also consider deploying an anti-malware solution on all remote machines in the event a cyber attacker is successful in deploying a malware payload. With an anti-malware solution in place, the malware can be detected and isolated.

Any anti-malware solution should also include an Endpoint Detection and Response (EDR) component. With the added EDR functionality, companies can identify the origin of an attack, the path it took, and the systems affected. This information will facilitate the investigative work of forensics experts. It will also reduce the financial cost of involving these experts, as their expertise is not cheap.

#3 Minimizing information loss

If companies are worried about sensitive information being leaked, we can help prevent that from happening.

If there is a data breach and sensitive personal data is leaked to the public, regulators will investigate. Depending on the findings, they may impose a fine. The size of the penalty will depend on the effort the company has made to improve its processes for handling personal data, and on the measures and controls it has put in place to minimize the impact of a breach.

#4 Employee cyber awareness training

Beyond implementing solutions, companies could also explore running training programs to empower their employees.

One could be in the area of cyber defense, so the company is ready to dash out of the gates when the economy picks up.

We provide cybersecurity awareness training to empower employees against cyber attacks. We will convert them from unwitting victims into alert defenders against any cyber onslaught.

#5 Sales training

For the sales team, we can also provide sales training to help them deliver better outcomes through better sales presentations and customer engagement.

Customers are getting savvier, and they expect the salespeople they deal with to be equally adept. Customers may be seeking a solution to a problem, but they ultimately buy into confidence in the service provider, conveyed through the salesperson representing it.

#6 Lead generating website

A crisis is always an excellent time to slow down, review, and improve the things (e.g., internal processes, training) you usually never have the time for.

With the impact of Covid19 weighing down on businesses, this would be a good time to modernize and sharpen the sword.

One of the things we can help with is your website. Your website has a bigger job to do than just looking pretty. We can help it generate not only more leads, but more relevant leads.

Do drop us a note to see how we can make it happen for you.

The end of HTTP is drawing near

In 2020, the end of HTTP is drawing near: Google is taking aim at websites with mixed content.

Mixed content includes content downloads such as software executables, documents, and media files offered from secure HTTPS websites over insecure HTTP connections.

Mixed content became an issue as websites moved to HTTPS from mid-2018, when Google’s Chrome browser started flagging sites that transmit information insecurely.

Users seeing the HTTPS padlock on a site in Chrome typically assume that any downloads it offers are also secure.

Google’s recent announcement points out:

Insecurely-downloaded files are a risk to users’ security and privacy.
For instance, insecurely downloaded programs can be swapped out for malware by attackers, and eavesdroppers can read users’ insecurely-downloaded bank statements.

To eliminate this issue, Google has recently announced a timetable for phasing out insecure file downloads in its Chrome browser.

It will be a gradual effort rather than an immediate hardline exercise. It begins with warnings in desktop Chrome version 81, due out in April 2020. By Chrome version 86, scheduled for October 2020, all downloads over HTTP will be blocked.

Mobile versions of Chrome will follow the same timetable, except that they will lag one version behind their desktop counterparts.

This latest plan underlines Google’s desire to improve security and user experience by promoting HTTPS everywhere in Chrome.

Note that eradicating unsecured downloads doesn’t guarantee a download isn’t malicious. In essence, it merely means the file hasn’t been tampered with as it travels from the web server to your computer.
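
If you want a rough idea of whether your own pages still offer downloads over plain HTTP, a quick scan of their anchor tags can surface them. Below is a minimal sketch, assuming Python with the requests and BeautifulSoup libraries installed; the URL and the list of file extensions are placeholder assumptions to adapt to your site.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

# File types commonly offered as downloads (adjust to taste)
RISKY_EXTENSIONS = (".exe", ".msi", ".dmg", ".zip", ".pdf")

def find_insecure_downloads(page_url):
    """List download links on a page that are served over plain HTTP."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    insecure = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])  # resolve relative URLs
        if urlparse(link).scheme == "http" and link.lower().endswith(RISKY_EXTENSIONS):
            insecure.append(link)
    return insecure

if __name__ == "__main__":
    for link in find_insecure_downloads("https://example.com/downloads"):  # placeholder URL
        print("Insecure download link:", link)
```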

As part of our web security offering, we can help scan and fix these mixed content issues.

Shall we have a conversation? 

Search engine blacklist – are you on them?

For businesses and individuals who rely heavily on online visibility, nothing feels more alarming than the possibility of landing on a search engine blacklist.

Once a site is blacklisted, its presence on search results can be drastically reduced or even removed altogether, cutting off traffic and credibility in an instant. For entrepreneurs, marketers, and content creators, understanding how these blacklists operate is not just a technical concern but a fundamental safeguard for protecting brand reputation.

What makes the issue of a search engine blacklist particularly challenging is that it often happens quietly, without immediate notice to the site owner. Many discover the problem only after experiencing a sharp drop in visitors or receiving alerts from users who encounter warnings before accessing the site.

By learning what triggers these blacklists and how to avoid them, website owners can shield themselves from unnecessary setbacks and maintain the digital trust that drives growth.

Why do search engine blacklists exist?

For a search engine, a good user experience is paramount, and protecting users from harmful websites is part of delivering that experience.

Websites that have been blacklisted will display a strong warning such as “This site may be compromised”, or have a red screen enveloping the entire website.

If your website is on a search engine’s blacklist, you not only lose organic growth; more importantly, you lose reputation and goodwill. Your ranking will drop drastically, and the business consequences can be devastating: lost sales, broken trust, and a long road to recovery.

In extreme cases, you might even have to rebrand your business and get a new domain name, rebuilding all the goodwill and organic traffic you have lost. That takes a lot of effort and time, and might include a costly rebranding campaign to inform your audience of your new website.

How do websites get on a search engine’s blacklist?

There are various reasons why a site is blacklisted.  

Blackhat SEO tactics. To rank higher on Search Engine Result Pages (SERPs), some SEO specialists adopt practices designed to dupe search engines into ranking a page higher. Cloaking and excessive link exchanges are two such tactics.

Spammy website. A site can become spammy when malware injects spam content into it. This makes for unhappy visitors, and it doesn’t go down well with Google once the site is discovered.

Website spreading malware. In some instances, your site could be used to push malware downloads to unsuspecting users. Once the search engines pick this up, visitors will be shown a red screen warning them not to proceed further. Your website will be quarantined and eventually blacklisted.

Website that plagiarizes. If a website steals content from other sites and passes it off as its own, the search engines’ artificial intelligence capabilities can quickly and easily pick it up. Be blatant about it and you will soon get blacklisted too.

The sad truth is that you might not even know you have been banned by the search engines.

It has been reported that malware typically resides on a website for 3 to 6 months without the owner’s knowledge. Things start to surface only when you are notified by your web host or, worse still, by your customers and prospects.

It is not the search engine’s role to remove malicious code; that falls into the lap of the website owner. If you have Google Search Console in place, you could at least get a warning.

Take a proactive approach

Instead of a reactive approach, which is challenging and detrimental, we would recommend a proactive approach – have an alert mechanism, link your site up with Google Search Console, and implement an anti-malware solution.

We believe that, as a site owner, you should be the “first to know” about anything happening on your website (including the presence of malware), rather than the search engines or anyone else. Negative news, once made public, can be hard to manage and might spin out of control, resulting in negative publicity.

Reputation is priceless. Why risk it?

One way to avoid it is to deploy an alert mechanism. Once an alert is triggered, you can start your own internal investigation and quickly rectify and contain the problem.
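
One building block for such an alert mechanism is to periodically ask Google’s Safe Browsing service whether your domain has been flagged. Below is a minimal sketch using the Safe Browsing Lookup API (v4); the API key, client name, and URL are placeholders you would replace with your own.

```python
import requests

API_KEY = "YOUR_GOOGLE_API_KEY"  # placeholder: create one in Google Cloud Console
ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def check_blacklist(url):
    """Return any Safe Browsing threat matches for the given URL."""
    payload = {
        "client": {"clientId": "my-site-monitor", "clientVersion": "1.0"},  # placeholder client
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }
    response = requests.post(ENDPOINT, params={"key": API_KEY}, json=payload, timeout=10)
    response.raise_for_status()
    return response.json().get("matches", [])  # an empty list means no known threats

if __name__ == "__main__":
    matches = check_blacklist("https://example.com/")  # placeholder URL
    print("Flagged!" if matches else "Clean.", matches)
```

Run on a schedule (e.g., a daily cron job), a check like this gives you a fighting chance of hearing the bad news before your visitors do.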

Next is to have a good anti-malware solution to nip the problem in the bud. With anti-malware in place on your site, malware can be picked up and tackled before it does its damage.

We can help get you ready and avoid trouble with the search engines.

If, however, you are already on a search engine’s blacklist, head to Google Search Console to take the necessary actions to remove any malware present. Next, check all login credentials and remove infected files manually. Alternatively, if you have an uninfected backup, you can restore it – just confirm there is no malware in the backup.

Once you have taken the preliminary effort to make your website “good” again, you can initiate a removal request through Google Search Console. This might take days.

Let us know if you need any assistance in the above area.

Mixed content and SSL

Mixed content is a security issue – one that a Content Security Policy (CSP) can help address.

For the longest time since the birth of the Internet, websites were not required to use the secure HTTPS transport protocol to display web content in browsers like Google Chrome. Sites without HTTPS transmitted traffic and data requests in the clear.

However, with the rise in cybercriminal activity, growing volumes of financial transactions, and mounting concerns around personal data, the need for secure transmission has become critical.

With effect from July 2018, Google’s web browser, Chrome, started flagging websites that are not HTTPS compliant. Google requires data and traffic to be encrypted in transit between the browser and the web server and vice versa over HTTPS. In this manner, neither the website nor its users are left prone to attack.

Website owners who handle transactions online started implementing SSL certificates to give their site visitors peace of mind that they are conducting their purchases on a secure site.

With privacy laws quickly being implemented by many countries, companies and organizations soon followed suit, enforcing the HTTPS protocol to safeguard private data provided by their customers over the web.

Mixed content occurs on websites that were designed and uploaded under an HTTP URL and later converted to HTTPS by implementing an SSL certificate.

Mixed content is a security loophole. It exposes your web traffic during transmission.

Despite the HTTPS web link, some content on the website, such as videos, images, and scripts, is still transmitted over the insecure HTTP connection.

Hence you have an issue of mixed content – HTTPS and HTTP – loading on one page, as in the hypothetical example below.
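
Here, a page served over HTTPS still references resources by their old absolute HTTP URLs; example.com is a placeholder domain:

```html
<!-- Page served from https://example.com/ - the absolute http:// references
     below are mixed content and will be flagged by modern browsers -->
<img src="http://example.com/images/logo.png" alt="Logo">
<script src="http://example.com/js/tracker.js"></script>

<!-- Fixed: reference the same resources over HTTPS -->
<img src="https://example.com/images/logo.png" alt="Logo">
<script src="https://example.com/js/tracker.js"></script>
```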

Any data transmitted over insecure HTTP exposes the website to “man in the middle” attacks. By intercepting these unsecured transmissions, cybercriminals can gain access to data such as login credentials and credit card details.

This mixed content issue must be quickly fixed to ensure ALL content is transmitted through the secure HTTPS protocol before a data breach occurs.
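
Beyond rewriting the offending URLs one by one, a common server-side mitigation is the CSP directive upgrade-insecure-requests, which tells browsers to rewrite a page’s http:// subresource requests to https:// on the fly. A minimal sketch for Nginx is shown below (the header can equally be set in Apache or in application code); note it only helps if the resources are actually reachable over HTTPS.

```nginx
# Inside the relevant server { } block:
# ask browsers to upgrade this site's http:// subresource requests to https://
add_header Content-Security-Policy "upgrade-insecure-requests" always;
```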

As a site owner, you want to fix this before it is too late.

As part of our web security offering, we can scan and fix these mixed content issues.

Shall we have a conversation?

Robots.txt – hiding sensitive pages on your website

Search engines are continually indexing the World Wide Web. They deploy efficient crawler programs to seek out webpages and index them for better search results.

However, there are some sensitive pages on a website that we recommend site owners not allow search engines to index and display, as exposure could invite a security breach. One such page is your Content Management System (CMS) login page.

Should a hacker find the link to your CMS login, they could try to brute-force their way into your CMS and take control of your website.

Fortunately, there is a way to ‘tell’ the search engines not to display these sensitive pages, by way of a file called robots.txt. In this file, you list the webpages you do not want search engines to index and make discoverable.

The robots.txt file is essential to search engines too. While indexing your website, a search engine’s crawler bot will look for the robots.txt file on your site and take a peek into it to see whether there are any pages it should avoid displaying. If there is nothing in the robots.txt, it will, by default, make all pages discoverable.
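
As a minimal illustration, a robots.txt served from the root of your site might look like this; the paths are hypothetical placeholders to replace with the actual login and admin paths of your CMS:

```
# robots.txt - served from https://yoursite.com/robots.txt
User-agent: *          # applies to all crawlers
Disallow: /cms-login/  # hypothetical CMS login page
Disallow: /admin/      # hypothetical admin area
```

Bear in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not block access to the pages themselves, so it should complement, not replace, proper access controls.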

Displaying a sensitive page to the wrong audience (i.e., a hacker) could result in it being hacked and the site compromised – something no human or search engine wants.

Hence search engines need your help to keep the Internet a safer place. They need site owners to specifically list the webpages they do not wish to have displayed.

Do reach out to us if you need any assistance in this area.

Audit logs – why you absolutely need them

There are many Content Management Systems (CMS) out there in the market. You need to choose one with a logging feature that can serve as an audit trail.

Many activities happen at the backend of a website: you make changes, add new capabilities, configure them, run backups, and so on. Logging may not seem important while you are building your website, but things do go wrong from time to time, and when they do, you want to be able to quickly understand what was done on your site.

Track changes

One of the primary uses of a logging feature is to track changes; the audit log is what results from it. With an audit log, you have a better understanding of what is happening on your website.

An audit log records the changes made, and some can also highlight the criticality of each change.

Ideally, the audit log application should be able to send off an alert on these critical issues when it detects them.

Some audit log applications can also send an email alert when they detect an abnormality (e.g., unexpected file size changes).
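
To make the idea concrete, here is a minimal sketch of what such a logging component could look like. The field names, severity levels, and alert hook are illustrative assumptions, not any specific product’s API:

```python
import json
import time

AUDIT_LOG_PATH = "audit.log"  # hypothetical location; append-only JSON lines

def send_alert(entry):
    """Placeholder alert hook - a real system might send an email here."""
    print("ALERT:", entry["action"], "by", entry["user"])

def record(user, action, severity="info"):
    """Append one audit entry and alert on critical events."""
    entry = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "user": user,
        "action": action,
        "severity": severity,
    }
    with open(AUDIT_LOG_PATH, "a") as log:
        log.write(json.dumps(entry) + "\n")
    if severity == "critical":
        send_alert(entry)

# Example usage
record("alice", "updated plugin X to version 2.1")
record("bob", "changed the admin password", severity="critical")
```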

Multiple users

As more users work on your website, the complexity increases. The need for accountability increases. With a logging feature, you now know who has access to the backend of the website and what action they have taken.

Without an audit log, you wouldn’t know who has done what. Pinning down responsibility would be difficult.

Access by an external vendor

From time to time, you may encounter issues beyond your or your team’s capabilities. You need outside help.  You need to grant access to an external third-party vendor.

When you allow that, you want peace of mind after the work is done that the vendor has not left behind any backdoor application they could later use to regain access to the website.

Having this assurance is essential. Otherwise, you will have sleepless nights, constantly wondering whether bringing in a third party was the right move – whether you have solved one problem but created a greater one: a vulnerability that can be exploited at any time by the vendor.

Cyber incident

Audit logs are most useful and critical when there is a cyber incident or data breach. In certain circumstances, an investigation is required by law. Intimate details of the incident – when it happened, how it happened, what path it took, which systems were involved – are needed to understand the scope of the damage and how to prevent it from ever occurring again.

The information provided by the audit log comes in handy at this moment. It significantly facilitates and shortens the investigation, and it reduces the number of days for which a cyber-forensics specialist must be hired.

Think of an audit log as an in-vehicle camera: when something happens, the camera (i.e., the audit log) provides clarity about the incident. It may not always give the full picture, but it does narrow things down considerably.

Backup error

Surprise, surprise, but backups do fail too.

Failures can happen any time there is a software conflict within the CMS, and when one occurs, it might prevent the last backup from restoring properly.

Your next course of action is to determine what happened between the last backup and the one before it. You might want to know who had access to the system and what steps they took.

If you have been actively working on the CMS, you want to know what work or actions you have undertaken during the period between the last two backups.

By taking a snapshot of the activities before doing a backup, you will gain some understanding of what you have worked on.

That will save tons of guesswork and mind-stretching trying to recall those activities.

From the above scenarios, the benefits of deploying an audit log application are overwhelming. Its use becomes more critical when the site gets larger or when there are more users in your CMS.

3 important things you need to know when working with search engines

To increase traffic to your website, you must first understand what drives search engines and how they go about achieving those goals.

The primary mission of the search engine is to create a good user experience for those who use their search capabilities. To achieve a satisfying experience, a search engine has to consider many factors.

However, these factors can be broken down into three broad categories.

Relevant content

Long before Google, there were already many other search engines on the Internet. AltaVista, Ask Jeeves, Excite, and others were household names back then, like Google is today. However, while these companies understood the importance of search engines on the Internet, they did not understand the power of quality, relevant content. Back then, searches on these platforms often returned disappointing results.

Hence, when a search engine (i.e., Google) that provided more relevant results came along, users migrated to the new platform. Users want quality, relatable results, and being able to provide them is vital. If you do it right, your visitors will come.

Quality results, in turn, depend on the right keywords and relevant content. With good content, visitors spend more time on the webpage, and they might even share it. These actions signal to search engines that the webpage is relevant to current and potential users.

Download speed

With more and more users surfing the Internet on the go from their mobile devices, download speed becomes increasingly important.

Download speed over mobile devices is typically slower than on devices physically connected to the Internet (e.g., over a LAN). On an already slow connection, any webpage that adds to the download time will irk the user. In the eyes of search engines, this constitutes a poor experience, and the website will be penalized by being ranked below comparatively faster sites. Optimizing a website for faster download is therefore critical.

It is imperative to check your website’s download speed from time to time, especially after adding something new. A newly added image or piece of functionality may impact your site’s download performance.
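
For a rough spot-check, you can time how long your homepage takes to fetch. The sketch below is a minimal example in Python with the requests library, using example.com as a placeholder URL; a proper audit would use a dedicated tool such as Google’s PageSpeed Insights, since server response time alone ignores rendering:

```python
import requests

def time_page(url):
    """Fetch a page and report server response time and payload size."""
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()  # time until response headers arrived
    kilobytes = len(response.content) / 1024
    print(f"{url}: {seconds:.2f}s to respond, {kilobytes:.0f} KB of HTML")

time_page("https://example.com/")  # placeholder URL
```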

Secure website

As cybercriminal activities intensify, search engines are putting ever more emphasis on website security. The more secure HTTPS transmission protocol rose to prominence because of this concern. With HTTPS in place, information transmitted between the web server and the client desktop or device is protected through encryption. Sites that run the HTTPS protocol have their domain name preceded by https:// in their URL.
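
A simple way to verify that HTTPS is set up correctly – and that the certificate is not about to expire – is to inspect the certificate programmatically. Here is a minimal sketch using only Python’s standard library, with example.com as a placeholder host:

```python
import socket
import ssl

def certificate_expiry(host):
    """Connect over TLS and return the certificate's expiry date string."""
    context = ssl.create_default_context()  # verifies the certificate chain
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return cert["notAfter"]  # e.g. 'Jun  1 12:00:00 2026 GMT'

print(certificate_expiry("example.com"))  # placeholder host
```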

In 2018, search engines like Google took it a step further by displaying a little padlock beside the URL of secure websites. Google went further still by warning its users about unsecured webpages they are about to visit.

Doing all three of the above things doesn’t guarantee you will appear on the first page of any search engine result page, but it does provide assurance that you are in line with the search engines’ mandate.

With over 100 ranking factors to optimize, getting the above right will serve as a good foundation for your optimization effort to rank higher within the search engines.
