Here are some web trends for 2020:
Responsive web design in 2020 should be a given because every serious project that you create should look good and be completely usable on all devices. But there’s no need to over-complicate things.
The article Web Development in 2020: What Coding Tools You Should Learn gives an overview of what is recommended to learn to become a web developer in 2020.
You might have seen Web 3.0 on some slides. What is the definition of Web 3.0 we are talking about here?
There seem to be many different definitions to choose from… Some claim that you need to blockchain the cloud IoT or you’ll just get a stack overflow in the mainframe, but I don’t agree with that.
The information shown in the browser address bar will be reduced in some web browsers. With the release of Chrome 79, Google completed its goal of erasing www from the browser: Chrome users can no longer automatically show the “trivial” www subdomain in the address bar.
You should still aim to build a quality website and avoid the signs of a low-quality one. Seek out good inspiration for your website design.
Still, a clear and logical structure is the first thing to think through before work on the website gears up. For search robots, a website’s structure is its internal links: the more links point to a page, the higher its priority within the site, and the more often search engines crawl it.
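The idea that internal links signal priority can be sketched with a tiny inbound-link counter; the page paths and link list below are made-up examples, not taken from any real site:

```javascript
// Illustrative sketch: rank pages by how many internal links point at
// them, mirroring how crawlers infer a page's priority within a site.
function inboundLinkCounts(links) {
  // links: array of [fromPage, toPage] pairs found while crawling the site
  const counts = {};
  for (const [, to] of links) {
    counts[to] = (counts[to] || 0) + 1;
  }
  return counts;
}
```

A page with many inbound internal links (e.g. one linked from every navigation menu) will score highest, which is exactly the signal search robots pick up.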
You should upgrade your website, but you need to do it sensibly and well. Remember that a site upgrade can ruin your search engine visibility if you do it badly; a redesign is the biggest risk to your site’s free search engine visibility. A bad technology choice can ruin the visibility of a new site months before launch. Many new sites built on JavaScript application frameworks do not benefit in any way from the new technologies. Before you jump on this bandwagon, think critically about whether your site will benefit from the dynamic capabilities of these technologies more than they can damage your search engine visibility. Well-built redirects can help you keep the value of most inbound links after site changes.
If you go the JavaScript framework route on your website, keep in mind that there are many to choose from, and you need to choose carefully to find one that fits your needs and will still be actively developed in the future.
JavaScript survey: Devs love a bit of React, but Angular and Cordova declining. And you’re not alone… a chunk of pros also feel JS is ‘overly complex’
Keep in mind the recent changes to video players and Google Analytics. And for animated content, remember that GIF animations still exist as a potential tool to use.
Keep security in mind. Many have a skills gap in security. I’m not going to say anything that anyone who runs a public-facing web server doesn’t already know: the majority of these automated blind requests are for WordPress directories and files, with PHP exploits a distant second. Many other things are attacked automatically as well. Test your site with security scanners.
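Anyone who tails an access log will recognize the pattern; a sketch of flagging those blind probes (the pattern list is my own illustrative assumption, not an exhaustive one):

```javascript
// Illustrative sketch: flag requests that look like automated probes
// for WordPress and PHP files in a web server access log.
const PROBE_PATTERNS = [
  /\/wp-login\.php/i,   // WordPress login brute-forcing
  /\/wp-admin\b/i,      // WordPress admin discovery
  /\/xmlrpc\.php/i,     // WordPress XML-RPC abuse
  /\.php(\?|$)/i,       // generic PHP exploit probing on a non-PHP site
];

// True if a single request path matches any known probe pattern.
function isProbe(requestPath) {
  return PROBE_PATTERNS.some((re) => re.test(requestPath));
}

// Count probes in a list of request paths (e.g. parsed from a log file).
function countProbes(paths) {
  return paths.filter(isProbe).length;
}
```

On a site that doesn’t run PHP at all, anything matching these patterns is noise by definition and a good candidate for rate-limiting or fail2ban-style blocking.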
APIs now account for 40% of the attack surface of all web-enabled apps. OWASP has identified 10 areas where enterprises can lower that risk. There are many vulnerability scanning tools available. Check also How to prepare and use Docker for web pentest. Mozilla has a nice online tool for website security scanning.
The slow death of Flash continues. If you still use Flash, say goodbye to it. Google says goodbye to Flash, will stop indexing Flash content in search.
Use HTTPS on your site, because without it your site’s ranking in search engine visibility will drop. It is nowadays easy to get HTTPS certificates.
Write good content and avoid publishing fake news on your site. Finland is winning the war on fake news; what it’s learned may be crucial to Western democracy.
Think about whom you are aiming your business website at. Analyze who your “true visitor” or “power user” is. A true visitor is a visitor who shows a genuine interest in the content of the site. True visitors are the people who should get more out of your site, and they have the potential to increase the sales and impact of your business. The content that your business offers is intended to attract visitors who are interested in it; when they show their interest, they are also very likely to be in the company’s target group.
Should you think of your content management system (CMS) choice? Flexibility, efficiency, better content creation: these are just some of the promised benefits of a new CMS. Here is How to convince your developers to change CMS.
Here are some fun for the end:
Did you know what happens if a spider creates a web at a place?
The place is called a website
Confession: How JavaScript was made.
2,357 Comments
Tomi Engdahl says:
I wonder what clinical language pathology is involved when, in clickbait headlines, all new technology is always scary, creepy, chillingly weird, or some such? Is there even a term for it? /iltalehti/
A chillingly scary and spot-on concept that a certain person came up with! See here for more details!!!!
Tomi Engdahl says:
Luckily I have mostly done work for technology-positive media, so threat headlines have been less of an issue. In technology-positive journalism, technology is seen by default as an opportunity, unless there are very weighty reasons why a threat should make it into the headline or lead. Smaller threats are of course still covered in the text.
Tomi Engdahl says:
Vibe coding with AI tools: A marketer’s guide
You don’t need to write code to build great sites. Organize prompts, files, and workflows so AI can handle web development at scale.
https://searchengineland.com/vibe-coding-with-ai-tools-a-marketers-guide-455134
Tomi Engdahl says:
Set your content playbook on fire: Why the old SEO game is over
Search is changing fast. Zero-click answers, AI content, and solved queries threaten your strategy. Here’s how to future-proof your approach.
https://searchengineland.com/set-your-content-playbook-on-fire-why-the-old-seo-game-is-over-455218
Tomi Engdahl says:
The World Wide Web And The Death Of Graceful Degradation
https://hackaday.com/2025/05/20/the-world-wide-web-and-the-death-of-graceful-degradation/
In the early days of the World Wide Web – with the Year 2000 and its threatened global collapse of society still years away – crafting a website on the WWW was both special and increasingly common. Courtesy of free hosting services popping up left and right in a landscape still mercifully devoid of today’s ‘social media’, the WWW’s democratizing influence allowed anyone to try their hand at web design. With varying results, as those of us who ventured into the Geocities wilds can attest.
Back then we naturally had web standards, courtesy of the W3C, though Microsoft, Netscape, etc. tried to upstage each other with varying implementation levels (e.g. no iframes in Netscape 4.7) and various proprietary HTML and CSS tags. Most people were on dial-up or equivalently anemic internet connections, so designing a website could be a painful lesson in optimization and targeting the lowest common denominator.
This was also the era of graceful degradation, when we web designers had it hammered into our skulls that using and navigating a website should be possible even in a text-only browser like Lynx or w3m, or in antique browsers like IE 3.x. Fast-forward a few decades and today the inverse is true: it is your responsibility as a website visitor to have the latest browser and fastest internet connection, or you may even be denied access.
What exactly happened to flip everything upside-down, and is this truly the WWW that we want?
User Vs Shinies
Back in the late 90s, early 2000s, a miserable WWW experience for the average user involved graphics-heavy websites that took literal minutes to load on a 56k dial-up connection. Add to this the occasional website owner who figured that using Flash or Java applets for part of, or an entire website was a brilliant idea, and had you sit through ten minutes (or more) of a loading sequence before being able to view anything.
Another contentious issue was that of the back- and forward buttons in the browser as the standard way to navigate. Using Flash or Java broke this, as did HTML framesets (and iframes), which not only made navigating websites a pain, but also made sharing links to a specific resource on a website impossible without serious hacks like offering special deep links and reloading that page within the frameset.
As much as web designers and developers felt the lure of New Shiny Tech to make a website pop, ultimately accessibility had to be key. Accessibility, through graceful degradation, meant that you could design a very shiny website using the latest CSS layout tricks (ditching table-based layouts, for better or worse), but if a stylesheet or some Java- or VBScript stuff didn’t load, the user would still be able to read and navigate, at worst in an HTML 1.x-like fashion. When you consider that HTML is literally just a document markup language, this makes a lot of sense.
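The same principle survives today under the name progressive enhancement. A minimal sketch (the endpoint, ids, and field names are hypothetical): the form is plain HTML that the server can handle with a normal POST, and the script only layers a nicer partial-update experience on top, so nothing breaks if it never loads.

```html
<!-- Works without JS: a normal form submission the server can handle. -->
<form id="search" action="/search" method="post">
  <input type="text" name="q">
  <button type="submit">Search</button>
</form>
<div id="results"></div>

<script>
  // Enhancement only: if this script fails to load or JS is disabled,
  // the form above still submits the old-fashioned way.
  document.getElementById('search').addEventListener('submit', async (e) => {
    e.preventDefault();
    const q = new FormData(e.target).get('q');
    const res = await fetch('/search?q=' + encodeURIComponent(q));
    document.getElementById('results').textContent = await res.text();
  });
</script>
```

In a Lynx-style text browser only the form is seen, and it still works – which is the whole point of graceful degradation.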
It’s An App Now
Somewhere along the way, the idea of a website being an (interactive) document seems to have been dropped in favor of the website being a ‘web application’, or web app for short. This is reflected in the countless JavaScript, ColdFusion, PHP, Ruby, Java and other frameworks for server- and client-side functionality. Rather than a document, a ‘web page’ is now the UI of the application, not unlike a graphical terminal. Even the WordPress editor in which this article was written is in effect just a web app in constant communication with the remote WordPress server.
This in itself is not a problem, as being able to do partial page refreshes rather than full-on page reloads can save a lot of bandwidth and copious amounts of sanity by preserving page position and avoiding flickering. What is a problem, however, is that there is no real graceful degradation amidst any of this any more, mostly due to hard requirements for often bleeding-edge features by these frameworks, especially in terms of JavaScript and CSS.
Wrong Focus
It’s quite the understatement to say that over the past decades, websites have changed. For us greybeards who were around to admire the nascent WWW, things seemed to move at a more gradual pace back then. Multimedia wasn’t everywhere yet, and there was no Google et al. pushing its own agenda along with Digital Restrictions Management (DRM) onto us internet users via the W3C, which resulted in the EFF resigning in protest.
Low-Fidelity Feature
Another unpleasant side-effect of web apps is that they force an increasing amount of JS code to be downloaded, compiled and run. This contrasts with plain HTML and CSS pages, which tend to be mere kilobytes in size in addition to any images. Back in The Olden Days™ browsers gave you the option to disable JavaScript, as the assumption was that JS wasn’t used for anything critical. These days, if you try to browse with a JS-blocking extension like NoScript, you’ll rapidly find that there’s zero consideration for this, and many sites will display just a white page because they rely on a JS-based stub, rather than the browser, to do the actual rendering of the page.
In this and earlier described scenarios the consequence is the same: you must be using the latest Chromium-based browser to use many sites, you will be using a lot of RAM and CPU for even basic pages, and forget about using retro- or alternative systems that do not support the latest encryption standards and certificates.
The latter is due to the removal of non-encrypted HTTP from many browsers, because for some reason downloading public information over HTTP and FTP without encrypting said public data is now considered a massive security threat; the former is due to the frankly absurd amounts of JS, with the Task Manager feature in many browsers showing the resource usage per tab.
There is no way to reduce these tabs’ resource usage – no ‘graceful degradation’ or low-fidelity mode – so older systems, as well as the average smartphone or tablet, will struggle or simply keel over trying to keep up with the demands of the modern WWW, with even a basic page using more RAM than the average PC had installed in the late 90s.
In short, graceful degradation is mostly a matter of wanting to, rather than some kind of insurmountable obstacle. It requires learning the same lessons the folk back in the Flash and Java applet days had to: namely, that your visitors don’t care how shiny your website is, or how much you love the convoluted architecture and technologies behind it. At the end of the day your visitors Just Want Things to Work™, even if that means missing out on the latest variation of a Flash-based spinning widget or something similarly useless that isn’t content.
Tomi Engdahl says:
https://hackaday.com/2025/05/26/wayback-proxy-lets-your-browser-party-like-its-1999/