It is important to protect your data (e.g. financial data, passwords) and yourself (e.g. from succumbing to fraud). If you run any network services, you will want to protect those as well (e.g. from being hacked).
It is also helpful to make your network traffic efficient. For this article I am assuming that you are running a dual-stack network (i.e. both IPv4 and IPv6) and that your WAN connection is somewhat limited in bandwidth.
I am also assuming that your network is correctly set up and that IP assignment, LAN segmenting, DNS and basic routing are all functioning correctly. What is not covered here is availability or reliability of the network and its services.
The following is a brief proposal for organizing protection and efficiency tools:
- BLOCK: The first step is to remove malicious targets outright at the DNS level (e.g. NextDNS); essentially this is blacklisting known bad destinations by name. Mostly this protects users by blocking resolution of known bad destinations, but blocking also covers known bad sources: incoming port scanners, scraper bots, bad-actor SEO bots, brute-force attackers, etc. The next step is blocking at the transport level (e.g. firewall ports on your router and possibly your hosts, as well as things like SNAT); mostly this means blocking (or whitelisting) ports for incoming connections. We can also perform network-level filtering of some optional targets, like abusive ads or trackers. (A small DNS blocklist sketch follows this list.)
- FILTER: The next step is to filter unwanted types of data, for example to block threats not yet publicly known, or to increase efficiency by blocking specific parts (e.g. web page ads) of an end user's transfer. Typically this is accomplished by monitoring flows across your firewall or at your host (e.g. an IDS/IPS like CrowdSec or Suricata), but it can also be done at the client (e.g. fail2ban or uBlock Origin). This prevents malicious sources from accessing your network, outgoing malicious traffic from reaching the web, and users on the LAN from accessing known malicious targets; it also covers bot networks (e.g. fail2ban). An IDS/IPS frequently requires a subscription or a crowd-sourced feed to keep up with a frequently shifting landscape. This is also a good place to provide network-level filtering of optional targets (IDS/IPS) and application-level filtering (e.g. uBlock Origin). Blocking and filtering have the secondary benefit of removing a chunk of traffic from the network. I do not filter ads myself, but really bad or obnoxious targets are removed from my network. (A toy fail2ban-style log scanner follows this list.)
- CACHE: After blocking and filtering, the only remaining traffic should be the traffic you actually want or request. It may be beneficial to cache some of this content locally, which can dramatically increase perceived performance. (A minimal caching sketch follows this list.)
- SHAPE: Content that cannot be cached (e.g. live events like meetings and A/V streams) can benefit from prioritization or fair queuing. There is no security implication here, but ordering traffic can lead to better or more predictable experiences on the network. For example, it is convenient to prioritize video conferencing over streaming, and streaming over bulk downloads or web browsing. Tools here are QoS or aggregating links if you have that option. (A toy priority-queue sketch follows this list.)
- REWRITE: Even legitimate traffic is sometimes subject to monitoring, monetization or abuse. In this case it is essential to rewrite this traffic. A good example is fingerprinting via network protocol or browser metadata. Tools here include fuzzing. (A small header-rewriting sketch follows this list.)
- MONITOR: The last step is to monitor your setup for escapes. Stay on top of your network topology, zero-days, software patches and the internet landscape, and keep an eye on which parts of your network are vulnerable or could use improvement. Tools here include logs, IDS and reporting. (A small port-audit sketch follows this list.)
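To make the DNS-level blocking in BLOCK concrete, here is a minimal sketch. The blocklist entries are made-up placeholders; real resolvers like NextDNS or Pi-hole work from large curated feeds and answer NXDOMAIN (or a sinkhole address) rather than returning nothing.

```python
# Minimal sketch of DNS-level blocking: refuse to resolve names on a blocklist.
# The blocklist entries below are hypothetical; a real setup loads them from a feed.
import socket

BLOCKLIST = {"tracker.example", "malware.example"}

def is_blocked(hostname: str) -> bool:
    """Match the name and every parent domain against the blocklist."""
    labels = hostname.lower().rstrip(".").split(".")
    return any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels)))

def resolve(hostname: str) -> str | None:
    if is_blocked(hostname):
        return None          # a real resolver would answer NXDOMAIN or 0.0.0.0
    return socket.gethostbyname(hostname)

if __name__ == "__main__":
    for name in ("ads.tracker.example", "example.com"):
        print(name, "->", resolve(name))
```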
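For the log-watching side of FILTER, this is a toy version of what fail2ban does, assuming made-up sshd log lines and an arbitrary threshold; a real deployment would tail the actual auth log and insert a firewall drop rule instead of printing.

```python
# Toy fail2ban-style filter: count failed SSH logins per source IP and report
# offenders over a threshold. The log lines are fabricated samples.
import re
from collections import Counter

FAILED = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")
THRESHOLD = 3  # arbitrary for the demo

sample_log = """
Jan 10 03:12:01 gw sshd[811]: Failed password for root from 203.0.113.7 port 40122 ssh2
Jan 10 03:12:04 gw sshd[811]: Failed password for root from 203.0.113.7 port 40123 ssh2
Jan 10 03:12:09 gw sshd[811]: Failed password for invalid user admin from 203.0.113.7 port 40130 ssh2
Jan 10 03:13:44 gw sshd[815]: Accepted publickey for me from 192.0.2.10 port 52110 ssh2
""".strip().splitlines()

hits = Counter(m.group(1) for line in sample_log if (m := FAILED.search(line)))
for ip, count in hits.items():
    if count >= THRESHOLD:
        # a real IPS would now ban this address at the firewall
        print(f"would ban {ip} after {count} failed attempts")
```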
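A minimal illustration of the CACHE idea, assuming a local cache directory of my own choosing; a real LAN cache (e.g. a caching proxy) would also honour HTTP cache-control headers and expiry instead of keeping copies forever.

```python
# Minimal sketch of local content caching: keep a copy of fetched URLs on disk
# and serve repeats from the cache instead of the WAN.
import hashlib
import pathlib
import urllib.request

CACHE_DIR = pathlib.Path("./web-cache")   # placeholder location
CACHE_DIR.mkdir(exist_ok=True)

def fetch(url: str) -> bytes:
    key = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
    if key.exists():
        return key.read_bytes()            # cache hit: no WAN traffic at all
    data = urllib.request.urlopen(url, timeout=10).read()
    key.write_bytes(data)                  # cache miss: store for next time
    return data

if __name__ == "__main__":
    for _ in range(2):                     # the second call is served locally
        print(len(fetch("https://example.com/")), "bytes")
```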
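To illustrate the SHAPE step, here is a toy strict-priority scheduler; the traffic classes and packets are invented, and in practice shaping happens in the router's queueing discipline (e.g. fq_codel or cake) rather than in application code.

```python
# Toy priority scheduler in the spirit of QoS: when the uplink is busy,
# drain higher-priority classes first.
import heapq
from itertools import count

PRIORITY = {"conferencing": 0, "streaming": 1, "browsing": 2, "bulk": 3}
order = count()            # tie-breaker so equal-priority packets stay FIFO
queue: list[tuple[int, int, str]] = []

def enqueue(kind: str, desc: str) -> None:
    heapq.heappush(queue, (PRIORITY[kind], next(order), desc))

enqueue("bulk", "ISO download chunk")
enqueue("conferencing", "video call frame")
enqueue("browsing", "web page request")
enqueue("streaming", "movie segment")

while queue:
    _, _, desc = heapq.heappop(queue)
    print("send:", desc)   # the call frame goes out first, the ISO chunk last
```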
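As a sketch of the REWRITE idea, this scrubs and fuzzes a few identifying HTTP request headers. The header set and generic user-agent strings are assumptions, and real fingerprinting covers far more (TLS handshake details, browser internals) than this touches.

```python
# Minimal sketch of metadata rewriting: scrub or randomise identifying
# request headers before they leave the network.
import random

GENERIC_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]
STRIP = {"referer", "cookie", "x-forwarded-for"}   # headers to drop outright

def rewrite(headers: dict[str, str]) -> dict[str, str]:
    cleaned = {k: v for k, v in headers.items() if k.lower() not in STRIP}
    cleaned["User-Agent"] = random.choice(GENERIC_AGENTS)   # fuzz the obvious bit
    return cleaned

if __name__ == "__main__":
    original = {
        "User-Agent": "MyBrowser/97.3 (build 1234; device XYZ)",
        "Referer": "https://internal.example/private-page",
        "Accept": "text/html",
    }
    print(rewrite(original))
```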
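One small, automatable piece of MONITOR is checking that nothing unexpected is listening on a host. This sketch assumes a whitelist of ports I intend to expose and scans only the well-known range; a real setup would also watch logs and IDS alerts.

```python
# Small monitoring sketch: compare the ports actually open on a host
# against a whitelist and flag surprises.
import socket

EXPECTED = {22, 443}                 # ports we intend to expose on this host
SCAN_RANGE = range(1, 1025)          # well-known ports only, to keep it quick

def open_ports(host: str = "127.0.0.1") -> set[int]:
    found = set()
    for port in SCAN_RANGE:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.05)
            if s.connect_ex((host, port)) == 0:
                found.add(port)
    return found

if __name__ == "__main__":
    surprises = open_ports() - EXPECTED
    if surprises:
        print("unexpected listeners:", sorted(surprises))   # investigate these
    else:
        print("only expected services are listening")
```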
What I would like to see is further development and maturation of these technologies. I would also like to see hardware (and software) capable of L7 introspection. Filtering and blocking by application kind of works today but is a bit hackish; it would be valuable to be able to block an application outright rather than trying to filter by port or guess domains. It would also be useful to fuzz traffic at the network level (e.g. confusing/scrambling trackers that look at patterns or web browser metadata, or possibly even fuzzing data provided by externally accessible hosts).
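To show roughly where shallow L7 introspection stands today, the sketch below extracts the SNI hostname from a TLS ClientHello (generated in memory, so no network is needed); this is essentially the metadata that current DPI-style filters key on, which is part of why blocking "an application" still reduces to guessing its domains. The hostname video.example is a placeholder.

```python
# Sketch of shallow L7 introspection: peek at the SNI field of a TLS ClientHello.
import ssl

def extract_sni(data: bytes) -> str | None:
    """Pull the server_name out of a plaintext TLS ClientHello, if present."""
    if len(data) < 6 or data[0] != 0x16 or data[5] != 0x01:
        return None                              # not a handshake ClientHello
    pos = 9                                      # skip record + handshake headers
    pos += 2 + 32                                # client version + random
    pos += 1 + data[pos]                         # session id
    pos += 2 + int.from_bytes(data[pos:pos + 2], "big")   # cipher suites
    pos += 1 + data[pos]                         # compression methods
    ext_end = pos + 2 + int.from_bytes(data[pos:pos + 2], "big")
    pos += 2
    while pos + 4 <= ext_end:                    # walk the extensions
        ext_type = int.from_bytes(data[pos:pos + 2], "big")
        ext_len = int.from_bytes(data[pos + 2:pos + 4], "big")
        pos += 4
        if ext_type == 0:                        # server_name extension
            name_len = int.from_bytes(data[pos + 3:pos + 5], "big")
            return data[pos + 5:pos + 5 + name_len].decode("ascii", "replace")
        pos += ext_len
    return None

if __name__ == "__main__":
    # Generate a real ClientHello in memory (no network needed) and inspect it.
    ctx = ssl.create_default_context()
    incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
    tls = ctx.wrap_bio(incoming, outgoing, server_hostname="video.example")
    try:
        tls.do_handshake()
    except ssl.SSLWantReadError:
        pass                                     # expected: we never answer
    print("client is trying to reach:", extract_sni(outgoing.read()))
```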