More often than not, the answer is 'no'.
We are hearing a lot of FUD about AI and machine learning and the need for more sophisticated detection capabilities. For our more mature customers, LogRhythm is providing these advanced machine-based detection and response capabilities.
However, the majority of use cases we see can be easily dealt with using standard scenario- or statistical-based behavioural analytics.
On hosts, the trouble often starts with the creation of an account, ideally (from the attacker's point of view) one with elevated privileges. If you have an admin group that is solely responsible for creating accounts and an account is created by someone outside that group, the new account may be cause for concern.
You could respond by disabling or deleting the account, either automatically or manually (via an approval tree), before any further activity takes place. It may sound simple and boring, but it's easy to implement and very effective.
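As a minimal sketch of that rule, assuming hypothetical event fields and a stand-in `disable_account` hook (not any specific SIEM or SOAR API), the logic amounts to a simple membership check followed by a response action:

```python
# Illustrative only: field names and the admin group are assumptions,
# not tied to any real SIEM schema or directory service.
ADMIN_GROUP = {"alice", "bob"}  # the only users authorised to create accounts

def review_account_creation(event, disable_account):
    """If the account creator is not in the admin group, disable the
    new account and report that action was taken."""
    if event["creator"] not in ADMIN_GROUP:
        disable_account(event["new_account"])
        return True
    return False

# Usage: a list stands in for a SOAR "disable account" playbook step.
disabled = []
suspicious = {"creator": "mallory", "new_account": "svc_backdoor"}
review_account_creation(suspicious, disabled.append)
print(disabled)  # ['svc_backdoor']
```

In a real deployment the response hook would be a playbook action (or a ticket in the approval-tree case) rather than a direct call, but the detection condition itself stays this simple.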
It might not be an account; it may be a new process, a new registry setting, or a new file. A similar detection and response methodology will suffice.
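The same baseline-comparison pattern applies to those artefacts. A sketch, assuming an illustrative per-host process baseline (the baseline contents and process names here are made up):

```python
# Anything observed on the host that is not in its known-good baseline
# gets flagged for review. Baseline entries are illustrative.
BASELINE_PROCESSES = {"explorer.exe", "svchost.exe", "lsass.exe"}

def flag_new_processes(observed):
    """Return the processes seen on the host that fall outside the baseline."""
    return sorted(set(observed) - BASELINE_PROCESSES)

print(flag_new_processes(["svchost.exe", "explorer.exe", "dropper.exe"]))
# ['dropper.exe']
```

Registry keys or file paths slot into the same shape: maintain a baseline set, diff each new observation against it, and route anything unexpected into the same disable-or-approve response flow.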
A recent example: after decrypting Telegram bot chatter, Forcepoint researchers discovered what they described as a "fairly simple" year-old piece of Windows malware, dubbed GoodSender, which uses the messaging platform to listen and wait for commands. Once it infects a victim's device, it creates a new administrator account and enables remote desktop.