VMware is the virtualisation leader. I have limited experience with VMware, having mostly worked on Hyper-V, but I am not blinkered enough to defend Hyper-V just yet. I think Hyper-V has its place, and I think it is giving VMware some nice competition to ensure this segment does not get stale; Red Hat and Citrix are not also-rans just yet either.
Hypervisors will continue to mature, but they will soon reach a point where their rate of growth in features slows and stability becomes the core focus. Outside of the hypervisor sit the management systems that create, deploy, maintain, manage, live migrate (Live Migration/vMotion), update and control the virtual machines running on the hosts.
With Microsoft’s latest beta release of System Center Virtual Machine Manager 2012, it has the ability to manage different hypervisors and deploy them to bare-metal hardware. I think a single-hypervisor environment will be a strategy for small players, with the big guys looking to maintain their real HA VMs on VMware, and possibly run less important virtual systems across Hyper-V, Red Hat or Citrix. A number of VPS hosts already provide several hypervisor options to customers.
What you won’t find is an appetite for multiple management systems, and the key to the virtualisation war will be which vendor can offer a management system to control the data centre and the hypervisors. No one wants to control different systems in different places, and management systems are expensive whereas the hypervisors are free.
VMware have already started to look at XVP, and Veeam have also started to work on Hyper-V compatibility.
I also think Microsoft know they CAN’T beat VMware on the hypervisor and its features. VMware are just too stable, too good, and too mainstream. I think Microsoft’s tactic will be to fight them in the management space, offering a management system that can control Hyper-V, VMware AND Citrix. That way, they make up ground via licence costs in SCVMM, and hopefully push other tools such as SCCM and SCOM; maybe even Opalis.
This has turned into a bit of a topsy-turvy post, probably not making much sense – so I hope you can sort of make sense of it all. Just keep an eye on VMware’s behaviour over the next few years. They have normally behaved in a sort of “we are the best, and don’t care about the rest” manner; however, I expect this to change as they fight off a number of strong competitors.
Logged in to my WordPress site Dashboard to be greeted with a message that an update to WordPress is available. I host my own WordPress site, and with a working life in web hosting, I choose not to have FTP open, as it’s just not safe with passwords sent in clear text. Yes, there is SFTP, but that is just a hassle for something I won’t use regularly, and it means firewall ports need to be opened. I also just don’t trust the stability of the FTP protocol, in terms of dropped connections and file locking.
When I need to access or update any files on my web servers, I create a VPN session to the network and make the necessary changes. It is a lot more hassle to set up, but I haven’t had any post-installation issues and it just works. It also makes me feel more secure when working. Surprisingly, from SA to the UK, the connection is very stable and rarely drops – even when working over a WiFi connection.
Getting back to my point, if I need to update WordPress then I have to do the following:
- Manually download the update zip file from WordPress.org
- Create a VPN connection to my web server network
- Create website file and database backups
- Upload the update zip file to the web server
- Unpack the update zip file
- Update WordPress
- Delete the update zip file and clean up
- Close the VPN connection to my web server network
- Test my WordPress site and reactivate all my plugins
I fully understand that this process is of my own doing, as I host my own site. However, is it not possible for WordPress to have some form of update function built into the website, one that downloads and updates files within itself? This would allow the archaic FTP option to be removed for the good and security of the Internet, and would save me from the work above.
I don’t quite know how this update function would work, but Windows Update and SVN could offer some interesting methods.
Perhaps each WordPress-specific (wp-*.php) page could have a unique identifier at the top or bottom of the page, specifying the version of the page. A basic WGET/PHP scheduled task could then check for the latest files by comparing the version number on each page against the latest version numbers within an XML file on WordPress.org. A simple WGET command could then download the necessary files.
The WordPress site would then routinely check all the file versions against an XML file of its own and run any local update scripts where necessary.
The above would be able to be either manual or automatic, and would aid in faster delivery of WordPress updates – especially those that are security related. Updates could be pushed out at night, and ready for activation first thing in the morning.
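The version-checking scheme above could be sketched in a few lines. This is purely illustrative: the manifest URL, its XML layout and the `@file-version` comment format are all invented for the example; WordPress.org publishes no such file.

```python
# Hypothetical sketch of the update check described above.
# MANIFEST_URL, the XML layout and the @file-version marker
# are assumptions made up for illustration.
import re
import urllib.request
import xml.etree.ElementTree as ET
from pathlib import Path

MANIFEST_URL = "https://wordpress.org/latest-files.xml"  # invented
VERSION_RE = re.compile(r"@file-version\s+([\d.]+)")

def local_version(path: Path) -> str:
    """Read the version identifier from the top of a wp-*.php file."""
    head = path.read_text(errors="ignore")[:512]
    match = VERSION_RE.search(head)
    return match.group(1) if match else "0"

def outdated_files(site_root: Path, manifest_xml: str) -> list[str]:
    """Compare each local file's version against the manifest."""
    stale = []
    for entry in ET.fromstring(manifest_xml).iter("file"):
        name, latest = entry.get("name"), entry.get("version")
        path = site_root / name
        if not path.exists() or local_version(path) != latest:
            stale.append(name)
    return stale

def run_update(site_root: Path) -> list[str]:
    """Fetch the manifest and download only the files that changed."""
    manifest = urllib.request.urlopen(MANIFEST_URL).read().decode()
    stale = outdated_files(site_root, manifest)
    for name in stale:  # a poor man's SVN update
        url = f"https://wordpress.org/latest/{name}"  # invented layout
        urllib.request.urlretrieve(url, site_root / name)
    return stale
```

Scheduled nightly (cron or Windows Task Scheduler), this would give exactly the manual-or-automatic behaviour described: compare, fetch only what changed, and leave activation for the morning.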
I have only limited developer skills, but the above doesn’t sound too out of place. Could it be done?
This concept video is produced by Corning, showing how a family will live through the day, interacting with futuristic screens, sensitive to sound, touch, weather, heat, visuals, and pretty much whatever your imagination can come up with.
This can’t come soon enough!
On any given morning, a look through my production Web server’s logs will show that my server farm is under a barrage of attacks. Hackers and crackers with automated IP port scanners can swamp a Web site with bogus requests and failed logons. The sheer volume of this traffic can reduce response times and overload service request logs. Failed logon attempts (sometimes several hundred in a minute) can obliterate legitimate security reporting in the event viewer. Even if the hacker never gains access to anything, your Web site suffers. I use several procedures to minimize the attack surface, but even after hardening the server and putting it behind a firewall, it is still vulnerable to attacks on port 80.
In this article, I will present an easily implemented strategy that uses HTTP 1.1 host headers to divert port 80 attacks away from unsecured public Web sites into a dead end where they can’t do damage. My site, called Hackerbasher, stops the automated attack and records the details about the attack along with the IP address used by the attacker. Hackerbasher doesn’t require any special software and its only cost is the time it takes to set it up on your server. You also get the added benefit of being able to monitor port 80 attacks in a single log file.
Many of these attackers appear to be crackers/thrill-seekers who simply want to break into something. Crackers usually sniff around for the obvious stuff such as unsecured databases and leftover developer sample files. Obviously, some attackers are on a mission to get in and do damage.
How It Works
Legitimate users don’t normally go to a Web site by typing an IP address, but automated tools do. Humans use the domain name. The log files from Hackerbasher for the past two years prove this: the only non-hacker traffic to Hackerbasher has been the occasional request for an invalid URL or an unresponsive domain.
I got the idea for Hackerbasher one morning back in 2002 while I was wading through endless IIS logs tracking a worm. I noticed that the hackers weren’t attacking the sites by their domain names but by their IP addresses. I was sure there was an automated tool out there systematically trolling through my IP pool looking for something listening on port 80. So, I thought, why not route all the IP:80 requests to a dead end in cyberspace? I then used host headers to do exactly that and called it Hackerbasher.
Setting Up Your Hackerbasher
1. Open the Microsoft Management Console (MMC) with the IIS snap-in.
2. Assign one host header (or several) to each Web site there so that no virtual server is mapped to an IP address on port 80 without a host header name. Unless you have a good reason not to do so, make sure that no Web server is using “All Unassigned” IP addresses.
3. Create a Web site that points to an empty directory (preferably not on the C: drive). You can use the standard defaults in the site creation wizard and call the site whatever you want. Remember, it doesn’t need a registered domain name since it won’t be listed in any DNS servers. Also, don’t install any server extensions like FrontPage® or SharePoint®.
4. Once you have created the site, right-click on it and select Properties. Click the Directory Security tab and select Integrated Windows Authentication, then click OK. Be sure to uncheck Anonymous Access and Basic Authentication.
5. On the Web Site tab, click the Advanced button. Use the Add button on the Advanced Multiple Web Site Configuration window to select each IP address that you want to assign to Hackerbasher. For me, this is all the IP addresses that are visible to the public.
6. Apply your changes and recheck your list to make sure that all your IP addresses are on it. If an IP address is already assigned to some other Web site, the MMC will give you an error message telling you there is a conflict. All you need to do is go back through the other Web site Identities and find the one(s) using an IP address on port 80 without a host header.
7. On the Web Site tab, make sure that Enable Logging is checked; I use the W3C Extended Log File Format. Next, click the Properties button next to Active Log Format and the Extended Logging Properties window will open.
8. On the General Properties tab, select the log time period you prefer (I use Daily). Select the Extended Properties tab and then select the extended properties that you want to have appear in your log file. I check all of the extended properties, except Process Accounting.
The IIS log files for the Hackerbasher site will now fill up with a list of IP addresses that aren’t legitimate. There are a host of programs that you can use to pull these IP addresses into a firewall, or into an IIS plugin that will block them in future so they don’t even get to your web server.
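As a rough sketch of that last step (not part of the original article), here is how attacker IPs could be pulled out of the Hackerbasher log for a firewall block list. Field positions are read from the `#Fields:` header rather than hard-coded, since the extended properties you log are configurable; `c-ip` is the standard W3C extended logging name for the client IP.

```python
# Minimal sketch: count requests per client IP in a W3C extended
# format IIS log. Every request that lands on Hackerbasher used a
# bare IP, so every IP counted here is a blocking candidate.
from collections import Counter

def attacker_ips(log_lines) -> Counter:
    """Map client IP -> request count from W3C extended log lines."""
    fields, hits = [], Counter()
    for line in log_lines:
        line = line.strip()
        if line.startswith("#Fields:"):
            fields = line.split()[1:]  # e.g. date time c-ip cs-method ...
        elif line and not line.startswith("#") and fields:
            row = dict(zip(fields, line.split()))
            ip = row.get("c-ip")
            if ip:
                hits[ip] += 1
    return hits
```

Feeding `attacker_ips(open("ex110101.log"))` into a firewall rule generator would automate the blocking described above; the log filename here is illustrative.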
Hardening Your Web Server
There are a number of procedures I typically follow in preparing one of my Web servers to go live on the Internet:
- Always keep security patches up to date. Applications to check include the server OS, IIS, SQL Server, FrontPage, Office, and SharePoint Team Services. I also notify my customers when I get new security bulletins.
- Run the Microsoft Baseline Security Analyzer tool on the server until all patches are complete and other exposures are minimized; then run the IIS Lockdown Tool and URLScan wherever possible.
- Enforce the use of role-based security and strong passwords on everything and everyone who can change anything on the server.
- All content sites are housed on a different hard drive than the OS and other key resources. Different customers’ sites are housed in separate, unrelated directory structures. Disaster and recovery procedures should be in place and in practice for every server.
- All sample sites and unused sites (like the IIS admin and the default site) are removed or incapacitated. All unused applications and services are removed or disabled.
- The server is behind a firewall with all ports closed except the ones I use.
- Use host anonymization software like ServerMask from Port80Software. This hides the server’s identity, vendor, and version in the host header from malicious hackers.
- Proactively test customers’ applications to make sure that there are no obvious security holes. In addition to testing their applications from the browser, I have just discovered a new product for testing Web application vulnerabilities. GreenBlue Inspector lets me view request and response headers, cookies, and forms input. It also lets me test for buffer overrun vulnerabilities and SQL injection vulnerabilities, two of the most common security failures in Web applications. (See the Resources box at the end of this article and the Toolbox column in this issue.)
- Always keep a watchful eye on your server’s logs.
If you love IT systems, high availability and redundancy, then you will enjoy this video. Some of the numbers are crazy, while on other aspects I did think to myself: “Is that it?”
Microsoft.com is a large and heavily visited site, yet it maintains high availability ratings because of a carefully planned infrastructure, team collaboration, and use of technology for maintenance, monitoring, and change management.
OK – I can’t seem to get the video to embed in the post, so I will just have to post the link.
I am slowly getting back to this topic, having deliberately left it a while before doing another post, as I wanted to have a think on some of my earlier concepts and see if any of my views would change.
If you are playing catch up, then catch the introduction and the consumerism posts before heading on. I will now concentrate on the enterprise as this excites me a little more and can offer the same experiences as the consumer model – but with the enterprise acting as their OWN Provider.
Terminal Services has been around in Windows since NT 4 and is nothing new, and virtual desktops are becoming a more popular topic of discussion. Ultimately they would all evolve into the same setup as the consumer model, but with the enterprise powering and hosting its own grid, either in a DC of its own (unlikely) or on a hosted infrastructure such as Amazon’s EC2 or Microsoft’s Azure (more likely). The benefit of this is that it removes the need for onsite data centre managers and excessive power and cooling, along with hardware research, purchasing, maintenance and recycling. This can either reduce the staff count or allow the existing IT staff to refocus their efforts on more appropriate tasks.
I am currently working on a SharePoint 2010 deployment, and have had some time to work with Office Web Apps; essentially, web-based versions of Microsoft Word, Excel, PowerPoint and OneNote that integrate into your local SharePoint deployment. This allows IT staff to centrally manage one version of Office that is deployed, updated and maintained in one place, rather than organising software deployments for staff members and managing multiple versions, updates and licences. SharePoint also stores all documents in the back-end SQL Server databases, so there is no longer a need for users to have local storage. By removing local storage, you are able to increase your security levels, as you remove the need for users to make use of USB flash drives and other removable media that could bring malware into the network, or be used to take sensitive material off the network. It also means that anti-virus and protection software need only be deployed on the servers, further reducing the cost and management of Desktop-based anti-virus software.
However, all this does make server management a much riskier and more stress-laden job: all the roles, security, processing and management tasks that ran on the Desktops are now placed on the servers, which already have roles, security, processing and management tasks of their own. Downtime is no longer an option, and high availability will become a commodity rather than a luxury.
Working for a corporate company that understands and appreciates the need for powerful software, I am lucky to be in a working environment where I can see the beginnings of back-end, server-based processing taking the lead and the power of the Desktop slowly diminishing.
Exchanging powerful Desktop PCs for cheap, low-power dumb terminals could finally see Linux make its long-predicted entrance as a main player on the desktop. All the dumb terminals need to run is some form of base browser that is light on processing and small in size; essentially an upgraded UEFI/BIOS. That’s Linux. This is actually something I think the Linux guys should seriously consider doing (before we get lumped with Windows CE!). They will just never beat Windows in the current environment. It just will not happen. So something needs to change for Linux to become more prominent: either Windows has to change (for the worse) or the environment needs to change – and the environment will change. Both consumers and enterprises will need these dumb terminal PCs, laptops and slates, and they will need a very basic, low-end underlying OS for basic tasks. Once those markets are cornered, why stop at PCs? Move on to kitchen appliances, TVs, bathrooms.
Consumerism is now taking the mantle from the Enterprise as the source of leading-edge tech, and I don’t see that changing. This model paves the way for less management, as workers use their own devices to connect and work instead of the business needing to buy its own equipment. It also reduces training costs, as workers already know how to use their own kit and therefore don’t need to be taught.
The ubiquity and reliability of the connectivity that we will have available to us will drive us from the grey, cubicle offices of today to the home or Starbucks of tomorrow. We will always have our devices with us, connected and available, allowing flexible working without hassle.
I think I have gone slightly off point from what I may have originally wanted to say, but I think the consumerism of the tech world and the growth of social networking is having a massive impact on the way we work, which in turn is having an impact on the way business leaders run their businesses.
My previous post opened up what Internet-based services really are, beyond the non-descript hype that marketing people waffle on about. This post is an extension of the grid computing future, and how it will go on to affect the consumer.
To try and explain my prediction, think about how you currently check your Hotmail emails: open a browser and navigate to www.hotmail.com, then supply your email address and password to log in to your account. You are now presented with a personalised mailbox environment, where your settings are omnipresent along with your email, no matter where in the world you log in. You set up your signature, SPAM settings and rules, and the settings are all saved by Hotmail and stored on their servers. Hotmail also take care of ensuring your mails are delivered and received, that any upgrades to the software are applied, and that your mail is backed up and always available.
Now take Hotmail, and enhance it with your applications, photos, music, documents – everything and anything you currently do on your Desktop right now, but hosted in a data centre, on the Internet – but take it further than Google Services or Windows Live. An Online Desktop.
User hardware will change to accommodate the Online Desktop concept. With processing power shifting from local resources to Internet-based servers, Desktop PCs and laptops will become stripped down to low-power dumb terminals, with minimal CPU power and memory but high-end graphics capabilities. The lack of CPU and memory is because the O/S will run nothing more than a lightweight browser.
ANYWAY… let’s put this all into practice so it makes sense: power on your shiny new dumb terminal and you are instantly presented with a login screen, with two text boxes and a drop-down box:
- Text box 1 will be labelled “E-mail Address”
- Text box 2 will be labelled “Password”
- Drop down box will be labelled “Provider”
Enter your e-mail address and password, and select the Provider your email account is associated with (e.g. Google, Microsoft, Facebook), and you are now logged in to your Online Desktop. This isn’t an RDP session; the dumb terminal loads a Desktop environment into the dumb terminal O/S window. It will look like the Desktop is running off the dumb terminal, but it will in fact all be running from a server a million miles away, over the Internet.
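Purely as a thought experiment, the three-field login flow above might look something like this in code. Everything here (the provider list, the endpoint URLs and the session object) is invented to illustrate the concept; no such API exists.

```python
# Speculative sketch of the dumb-terminal login described above.
# Provider names and endpoints are invented for illustration.
PROVIDERS = {
    "Microsoft": "https://desktop.live.example/session",    # invented
    "Google": "https://desktop.google.example/session",     # invented
    "Facebook": "https://desktop.facebook.example/session", # invented
}

class OnlineDesktopSession:
    """Represents the remote Desktop stream the terminal would render."""
    def __init__(self, endpoint: str, email: str):
        self.endpoint = endpoint
        self.email = email

def login(email: str, password: str, provider: str) -> OnlineDesktopSession:
    """Map the three login fields to a remote Desktop session."""
    if provider not in PROVIDERS:
        raise ValueError(f"Unknown Provider: {provider}")
    # A real terminal would authenticate here and open a streaming
    # channel; the thin client only ever renders what comes back.
    return OnlineDesktopSession(PROVIDERS[provider], email)
```

The point of the sketch is the routing: the terminal itself holds no state beyond the Provider list, and everything else lives server-side.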
Now you have logged in, let’s say for example, you want to listen to music. You link your email address to iTunes, so when you open the web-based iTunes, you are auto-logged in and have access to all available music, all of which is streamed directly over the Internet. No more downloading. iTunes is then able to bill you based on a monthly subscription or pay as you use.
Let’s take photos: plug your camera into the dumb terminal and the O/S detects the USB activity and relays it to your Online Desktop, which in turn auto-uploads all the images to your Flickr or Facebook account and auto-tags all the people in your pictures. However, most cameras will become smartphones with their own connectivity, making this exercise slightly redundant.
Films will be streamed through a browser from Netflix, as will online gaming – hence the need for the additional oomph of graphics capability in the thin terminal. All of the graphical processing will be done on the servers before being pushed to the dumb terminal. It will also take online, social gaming to a whole new level.
Document management will work in a similar fashion to Microsoft’s SharePoint, with a browser-based ECM/CRM, giving the ability to create and edit documents using Office Web Apps or Google Docs (depending on your Provider). Permissions can be applied to other users based on their email account, enhancing sharing and collaboration.
Your email account can be linked to Skype, MSN Messenger and Facebook, allowing you to access all of your friends via one channel – irrelevant of what IM service they choose to use. Webcams can certainly be built in to the dumb terminal monitors and provide sound and video for IM conversations.
The benefit of such a system is that you will always have access to your Online Desktop, as it will be accessible anywhere in the world with Internet connectivity. Local storage, flash drives, DVDs, etc. will become redundant. Everything will become an on-demand, pay-as-you-use service, which is what the Cloud really is. For the consumer, this will provide flexibility in the following:
- No need to purchase or upgrade hardware and O/S
- Cheaper hardware, using less power
- No need to purchase, install and manage applications
- No need to purchase, install and manage security software
- No need to manage document backups and restores
- No more local storage and peripherals
- No more “I’m a Mac, and I’m a PC” adverts!
Your Online Desktop will be a sandboxed environment, managed, protected and hosted by your Provider. Now who doesn’t want that?
This is an idea that I have been talking about for some time (to whoever will listen), but want/need to get it down on paper before Google steal my thunder! The idea is based on “the Cloud” concept; I don’t like the phrase “the Cloud” as it is too easily bandied around by people who don’t fully understand what it means. I will try and revert to phrases like “Internet-based services”, or “the Grid” as they provide a much better description…and “Grid” just sounds cool!
Through work experience, I inadvertently managed to gain a true understanding of this latest shift in IT and modern technology, and can reflect better on where it is headed. One of the clients was an importer of fancy goods. They had a warehouse in Waltham Abbey (Essex/London), where they received manufactured goods and then distributed them to retailers – very similar to the wholesaler principle.
They had a Microsoft-based network running Windows NT BackOffice 4.5 – the days of Windows NT 4 Server, Exchange 5.5, SQL Server 7.0 and IIS 4.0. Alongside this, they had an IBM RS/6000 C10, running IBM’s AIX 4.5 Unix O/S. On top of this ran their warehouse solution on Informix, containing their order processing, customers, suppliers and invoicing. The C10 was a dated piece of hardware, with a SCSI card powering two 4GB SCSI disks, a 10/100 network card and a DAT40 DDS tape drive. I dread to think how minuscule the CPU and RAM were.
All the office workers were supplied with a desktop PC running a variety of Windows platforms, but would use telnet to connect and do their core work on the IBM C10. The C10 would do ALL the processing; the office workers would merely open sessions to it, similar to how users might open an RDP/Terminal Services session on a Windows Terminal Server. Any and all software, patches, upgrades and system changes were managed by us, and only ever needed to be applied to the one C10 machine. Only the C10 ever needed to be backed up. The Windows desktop PCs acted merely as dumb terminals.
What is remarkable is that this is a system that was designed almost 30 years ago (…hence the blog post title), and yet it’s now a concept that’s being thrown around like it’s something new. Upgrade that original system to today’s world, and you can exchange the C10 for the latest blade systems, swap IBM’s AIX O/S for a clustered virtual server solution, and upgrade the local LAN to ADSL, 3G, Wi-Fi, etc.; all hosted in an offshore data centre with redundant power, connectivity, and cooling. For the user, the telnet session will be swapped for a web browser.
Now you have a system leveraging redundancy, accessibility, and sustainability to run your applications without the hassle of multiple instances of deployment and management. We will/should see an end to the deployment of local applications and a shift to Internet-based services, accessible anywhere and everywhere.
NOTE: The reason I mentioned getting this down before Google spoil it for me is that they recently released a Cloud-based netbook where all its applications were Internet-based services and you couldn’t install anything on the netbook itself. The O/S was just a basic version of their future release of Chrome OS(?), and desktop widgets/browser shortcuts led you to all your Internet-based services.
I had a conversation with a philosophical friend last week about social networking, the heavy influence the Internet plays in our lives, and virtual reality. My sermon ended up going pretty wild on the new generation of social interaction via social media networks and how it could/would/will spiral wildly out of control to the point where we will never need to leave our homes.
Digital social interaction actually started out via the USENET newsgroups, where people could discuss topics and share information – i.e. the founding “consumer” side of the Internet, outside of its DARPA/military roots. Newsgroups/NNTP have slowly evolved into web-based forums, and newsgroups seem to be slowly falling away – though I do know Microsoft are still keeping the concept alive with their Microsoft Connect site.
Then came email, which was pretty much a digital replacement for physical mail. I won’t waste your time with the history or relevance of email, other than to say that in 2010, 1.79 trillion emails were sent – that’s a lot.
Following on from that came MySpace, Facebook and Twitter – there are of course others, but they are the Big Three, as it were. Now 500 million people connect, like, tag and update through Facebook. Twitter posted 25 billion tweets for 2010. I don’t have any MySpace stats (…just to annoy Rupert Murdoch should he ever read this) – does anyone even still care about MySpace?! In a bizarre way, these services have allowed us to connect to one another in ways that would previously have been difficult or weird. I use those terms as it would be difficult trying to look up ex-friends via the phone book, and weird if you were on the receiving end of a random call from an old flame who phoned up just to say hi and find out what’s new. Now we can all stay in contact with one another via a cloud service that (I think) is making us lazy about properly interacting, as it is so much easier to just “like” a friend’s status; which somehow classifies itself as interaction. Lazy is probably too heavy a word, but Facebook is so much more convenient; and it’s free, so I won’t waste any money making a phone call or paying for a stamp. Think about what life was like before getting a driving licence: I was happy to use public transport or walk, as it was the best that was on offer. The moment I got a driving licence, I didn’t have “time” to wait for public transport, I “needed” to drive now – driving was more convenient.
The cliché with such services is that as much as Facebook brings us closer together through a digital medium, it takes us further away from the true physical interaction that our human nature is built upon – merely because it is more convenient to pick up our phone or laptop and post a status update. It’s quicker, and requires less effort. I can relate to this, as I spent a year in industry in 2003, whilst the majority of my friends were still studying at university. On a regular basis, I would keep in touch via phone, SMS, MSN Messenger or lengthy emails. In 2010, I am now in a different country, and I don’t do any of the above anymore, as I can just quickly reply to a thread, see my friends in my News Feed, and view tagged photos of their nights out - so I know what they are up to.
Social media removes the personal touch from communication. The same personal touch that you can only get from physical interaction.
This is where we are now… now let’s try to project into the future a little. How do we retain the benefits of digital/social interaction but add the personal touch that a web page cannot deliver? I am sure the smartest guys at the big social media companies are probably mulling this over.
With such advancements in 3D and virtual reality, are we really that far from a future where we can put on headgear that somehow interacts with our brains and drops us into a Matrix-style/Second Life VR world where we can interact?
The headgear in question would actually connect to our brains, so if I drink something in a VR world, the headgear tells my brain to trick my physical body into feeling a liquid going down my throat, and tells my taste buds what it is supposed to taste like. In a VR world, I could decide what to wear and where to go without taking my pyjamas off or leaving my home. Dropping into a VR world, I could go to a club and drink lots of alcohol, but not get liver damage or actually drink and drive. I could enjoy the feeling of smoking without putting my physical body at risk. I could meet friends who live in different countries for a night out, while we all remain in our separate countries. How amazing would that be? We could be anything we want, do anything we want… I just thought: we could actually go to work (which would be in a VR world) and not have to shower, shave, or even commute.
Now we can actually interact with one another, but still in the digital space. Tapping devices into people’s brains is definitely a high-risk issue, but as a concept, it really isn’t that wild or out of reach.
Ironically, the concept is already starting if you check out the Google Art Project:
Explore museums from around the world, discover and view hundreds of artworks at incredible zoom levels, and even create and share your own collection of masterpieces.
All of which can be done via a browser. Why go to a museum if the museum can come to you?
Digging through more of my theSpoke posts from years gone by, I came across one of my posts describing the results of a website game that determines which operating system is closest to your personality, based on a number of scenario-based questions. Five years ago, I was Windows 95…
Which OS Are You? (19/05/2005)
I saw this on another blog here on theSpoke where you fill in a few questions and it tells you what version of Windows you are most like… I am Windows 95! I can’t believe it says my communication skills are lacking, although it says I look better than my older brother, which means I am better than anyone set as Windows 98. I pity anyone who gets Windows ME, that’s just shameful!
I find these “What are you…” quizzes quite intriguing, as we all judge ourselves by the pointless results, waste hours of time doing them, and then email them on (or even blog them!) to others!
Well, that’s enough distractions, back to work.
The original blog post linking to the quiz’s website has since gone, but a little trawling and I found the original site. Well I had to retake the test: five years on, and I am now:
Well, the description seems a little closer to the truth; however, Palm has gone down the pan and been bought out by HP to compete in the mobile/tablet space.
The website has a few other quizzes:
- Which File Extension Are You?
- Which Nigerian Spammer Are You?
- Which Programming Language Are You?
- Which Website Are You?
I think I will save those for another sleepless night – definitely looking forward to finding out which spammer I am.
A persistently tormenting person, force, or passion: The demon of drug addiction;
One who is extremely zealous, skillful, or diligent: Worked away like a demon;