Skip to 0 minutes and 11 seconds Welcome, everybody. My name is Steven Rogers, I'm the enterprise cloud and infrastructure architect for Coventry University IT Services, and today I'm going to give you a bit of information about the past, present and future of network defence management, in the university and generally. I'm going to do a bit of a timeline, going back about a decade, 2009 to 2019, and show you how the technology has changed. So I'll start with the timeline, and we'll go back to 2009. Back then we used to deploy applications on what we called a one-to-one ratio: we had to buy a physical server platform and we would install one of those into our data centres.
Skip to 0 minutes and 55 seconds Very expensive systems, and we would have a ratio of one to one. What that means is we put the application on that particular server and it was used for nothing else. In a lot of cases the resource wasn't used to the maximum, you know, it was under-utilised, but it had to sit there running that particular role all of the time. That happened quite a lot. We got to a point where data centres would have probably hundreds and hundreds of these physical servers just running web applications, databases, file servers, those sorts of things. But as the technology began to evolve, we introduced virtualisation technology in around 2012.
Skip to 1 minute and 43 seconds What this enabled us to do was buy larger servers and install a piece of software on top of those servers that allowed us to get more out of that hardware: through virtualisation we could put many applications on those servers. We would get a ratio of around 12 applications to one server. So, as you can see, a lot of this physical environment started to reduce. Where we had hundreds of servers before, there were now fewer; the rooms needed, the floor space we used to take up on campus, it all got reduced. But we didn't stop there, we continued to modernise, and the virtualisation technology got better.
Skip to 2 minutes and 23 seconds This took us into around 2015, and these servers got even larger; today we're looking at a ratio of 80 to one. So we have hundreds of applications, and we support all areas of the university. These applications can come from everywhere: research, core business platforms, you know, it doesn't really matter any more, because we have such a high ratio of platforms. But, again, we're moving into a more modern world. So, where we are today is we link the university data centres to what you may have heard of as the cloud. This has a dedicated connection across to the cloud, and we have begun to deploy our services in both locations.
Skip to 3 minutes and 20 seconds So we can still put a lot on these physical platforms, but we now have a capability where we don't need this physical environment any more. We can start to use compute offered by big players in the market, such as Microsoft and Amazon.
Skip to 3 minutes and 35 seconds They have very different clouds: we have Amazon Web Services and we have Microsoft Azure. And we can begin to deploy these computers there, so the ratio goes out of the equation. If you want to deploy lots of services (web services over here, file systems over there), you don't actually need any servers of your own; it's all just infrastructure as a service or platform as a service, and you pay based on what you actually use. So, in terms of how we protect this particular environment: if we go back to 2009, the firewalls that protect the university, or any business, sit at the perimeter.
Skip to 4 minutes and 27 seconds So, the firewall that protects us from the internet then has a series of other internal firewalls and then these protect those hosts. Whether or not they exist here or we begin to move them over to the cloud, all these hosts are protected by a variety of access controls. Now what that means is that we only allow certain ports from the internet from trusted sources or some public services that are allowed to connect straight through to these individual server platforms. So, in terms of this timeline we have firewalls that haven’t really changed up until about 2015. But the intelligence started to change in the firewalls too - they started to get what’s known as a next-generation firewall.
Skip to 5 minutes and 26 seconds And this happened for the University around here, 2017, and what it allows us to do is it brings in modern techniques in protecting this environment so where before we just had basic filtering of rules, ports, access control, we now can do full intrusion protection, we know if somebody’s trying to attack a particular interface. On systems we know the source, the country of origin, and the intelligence, like I was saying, is built directly into these firewalls.
The journey overview
In the video, we introduce you to Steve Rogers, who is an enterprise cloud and infrastructure architect at Coventry University.
Steve puts into context the development of network defence management over the past ten years – and highlights the role of firewalls.
Firewalls are an integral part of all computer network security solutions. Their basic function is to protect the local (trusted) network from an outside (untrusted) network, eg the internet. In doing so, the firewall monitors and analyses all incoming and outgoing traffic and reacts based on a predefined set of rules.
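The "predefined set of rules" mentioned above is typically an ordered list that the firewall evaluates for each packet, applying the first matching action. A minimal sketch, with illustrative rule fields only (real firewalls match on many more attributes, such as source address and protocol):

```python
# Ordered, first-match rule table: the firewall checks incoming and outgoing
# traffic against each rule in turn and reacts with the first matching action.
RULES = [
    {"direction": "in",  "port": 443, "action": "allow"},  # inbound HTTPS
    {"direction": "in",  "port": 23,  "action": "deny"},   # block inbound telnet
    {"direction": "out", "port": 25,  "action": "deny"},   # block outbound SMTP
]
DEFAULT_ACTION = "deny"  # traffic matching no rule is dropped

def decide(direction: str, port: int) -> str:
    """Return the action for a packet travelling in/out on the given port."""
    for rule in RULES:
        if rule["direction"] == direction and rule["port"] == port:
            return rule["action"]
    return DEFAULT_ACTION
```

The default-deny fallback at the end is what makes the trusted/untrusted boundary effective: only traffic the policy explicitly permits crosses between the local network and the internet.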
A firewall can be a very effective part of network security and, in many cases, it is the first line of defence in a multi-layered approach. It can also help implement and enforce the organisation's security policy regarding traffic going to and from the internet.