*What’s the tech behind Blizzard Watch? I know you have been able to withstand BlizzCon traffic over the last couple of years; what did you do to hold up?*

I’ve answered this once or twice in the past, I think, but that’s the past and things change constantly, so I think it’s about time to answer it again. I’ll start from the bottom up because I’m responsible for the entire stack. In layperson terms, that means I do everything from setting up the server to making text bold and switching out images.

We run on a Digital Ocean Linux VM with a large amount of memory, disk space, and cores. Everything is pretty speedy, and I am constantly in there applying patches and keeping things up to date. I want to avoid a security incident if at all possible, so I’m doing my level best to achieve that.

Right now we’re only running on one server with a couple of layers of caching involved (a page cache on the server, and Cloudflare; more on that later). When I anticipate that I’m going to need more power, I have a round-robin multiserver setup that I can turn on that will let me scale up as required. It takes me about 30 minutes to get this system up and running, and about 10 minutes to spin up another server to help distribute the load. I could do A LOT better here and have an autoscaling system in place; however, I haven’t really found the time for that yet. That said, it’s on my list for a rainy day, and either way, the setup that I have right now allows me to withstand BlizzCon and patch days.

Last year I finally switched from Apache to NGINX. I’ve used Apache for almost 20 years now, but NGINX is much faster with PHP and has a lighter load on the server, which means less of a need to spin up additional servers and spend more money. When I have server techs working for me on projects I always have them use NGINX, but directing the work and doing it yourself are two very different things.
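The NGINX-with-PHP setup and the server-side page cache described above might look roughly like the sketch below. This is not the site's actual configuration; it's a minimal illustration using NGINX's standard `fastcgi_cache` full-page caching in front of PHP-FPM, with made-up paths, zone names, and a placeholder domain.

```nginx
# Hypothetical sketch: NGINX serving PHP through PHP-FPM with a
# full-page cache. All names and paths here are illustrative.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=pagecache:10m
                   max_size=1g inactive=60m;

server {
    listen 80;
    server_name example.com;          # placeholder domain
    root /var/www/html;
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;  # PHP-FPM socket; path varies

        # Serve cached copies of whole pages so PHP is only hit on a miss.
        fastcgi_cache pagecache;
        fastcgi_cache_key $scheme$request_method$host$request_uri;
        fastcgi_cache_valid 200 10m;
        fastcgi_cache_bypass $http_cookie;        # skip the cache for sessions
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

With a setup like this, a CDN layer such as Cloudflare sits in front as a second cache, so most requests never reach the origin at all; the `X-Cache-Status` header makes it easy to see hits and misses with `curl -I`.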
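The round-robin multiserver setup mentioned above can be sketched with NGINX's `upstream` block, whose default balancing method is round-robin. Again, this is an assumed illustration, not the real config; the IPs and pool name are invented.

```nginx
# Hypothetical sketch: a load-balancer pool that can be scaled up by
# adding servers on big traffic days. Round-robin is NGINX's default.
upstream backend_pool {
    server 10.0.0.2;         # primary app server
    server 10.0.0.3;         # extra server, spun up when load demands it
    # server 10.0.0.4 down;  # kept in the file, re-enabled as needed
}

server {
    listen 80;

    location / {
        proxy_pass http://backend_pool;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Turning the extra capacity on amounts to uncommenting (or removing `down` from) a `server` line and reloading NGINX, which fits the described "about 10 minutes to spin up another server" workflow.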