Plainfiles and routing
Routing, in simple terms, is the process of taking the URI and figuring out what the request is.
The router itself, though, only translates the URI into another format for other processes to consume.
Guided by some configuration, the router attempts to map the parts of the URI path to components, mainly a controller, an action and maybe a module. This way the request takes a defined form the framework understands.
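The mapping described above can be sketched roughly as follows. This is a minimal, illustrative sketch in Python, not the framework's actual code; the function name, defaults, and component order are all assumptions.

```python
def route(uri: str) -> dict:
    """Split a URI path into module, controller and action parts.

    Hypothetical convention: /module/controller/action, with defaults
    filling in whatever the URI does not name.
    """
    parts = [p for p in uri.strip("/").split("/") if p]
    result = {"module": "default", "controller": "index", "action": "index"}
    if len(parts) >= 3:
        result["module"], result["controller"], result["action"] = parts[:3]
    elif len(parts) == 2:
        result["controller"], result["action"] = parts
    elif len(parts) == 1:
        result["controller"] = parts[0]
    return result

print(route("/blog/post/show"))
# {'module': 'blog', 'controller': 'post', 'action': 'show'}
```

The point is only the translation step: the router produces a structured description of the request and leaves acting on it to later processes.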
The configuration in my framework is pretty basic and simple because it needs to work very dynamically. The main logic of linking the request to actual content is passed up the ladder and handled by other processes.
Faced with some other challenges I had to rethink the routing, and more precisely the router configuration. After the change it still has the same simple, dynamic flavor but a couple more tricks up its sleeve. It is now a tiny bit smaller and even more dynamic, which allows for a few new features.
The main advantage, and the reason for the change, is the processing of plain file requests. A couple of weeks ago I created a new module to handle such files, but I had to implement it as some sort of last-resort module. In other words: before we return a "file not found" error, let's see if this is a request for a certain plain file.
It was still an improvement, because previously such files had to exist physically and statically in the system; the exception was the sitemap.xml file, which is dynamic in nature and already had its own module and routing information. The routing now looks at all plain file requests and also funnels sitemap.xml through this new module.
The new module is designed to handle all sorts of plain files, which is pretty neat for several reasons. A plain file handled by the framework can be managed and also logged. The logging is particularly interesting because I can create reports easily. Now I can see exactly who requested a certain type of file, and when.
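The logging idea can be sketched like this. Everything here is hypothetical: the record fields, the in-memory log, and the report function stand in for whatever storage and reporting the framework actually uses.

```python
from datetime import datetime, timezone

request_log = []  # in a real setup this would be a log file or database table

def log_plain_file_request(filename: str, client_ip: str) -> None:
    """Record who requested which plain file, and when."""
    request_log.append({
        "file": filename,
        "ip": client_ip,
        "at": datetime.now(timezone.utc).isoformat(),
    })

def report(filename: str) -> list:
    """All recorded requests for one file type -- the basis for a report."""
    return [r for r in request_log if r["file"] == filename]

log_plain_file_request("robots.txt", "203.0.113.7")
print(len(report("robots.txt")))  # 1
```

Because every managed plain file passes through one module, one small logging call covers them all.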
I currently manage three file types:
- the already mentioned sitemap.xml
- robots.txt for search bots and spiders
- plus the humans.txt initiated by the humanstxt.org group.
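A routing configuration for these three files might look something like the sketch below. The table layout, key names, and lookup function are assumptions; the only thing taken from the text is that all managed plain files are funnelled through one module.

```python
# Hypothetical routing table: every managed plain file goes through
# a single "plainfile" module rather than being served statically.
PLAIN_FILES = {
    "sitemap.xml": {"module": "plainfile", "action": "sitemap"},
    "robots.txt":  {"module": "plainfile", "action": "robots"},
    "humans.txt":  {"module": "plainfile", "action": "humans"},
}

def route_plain_file(uri: str):
    """Return routing info if the URI names a managed plain file, else None."""
    return PLAIN_FILES.get(uri.strip("/"))

print(route_plain_file("/robots.txt"))
# {'module': 'plainfile', 'action': 'robots'}
```

Adding a new file type then amounts to adding one entry to the table, which fits the dynamic flavor of the configuration.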
Of course, you can test this and see my humans.txt.
I currently have an eye on a few more but will add them at a later time.