Memcached UDP Protection

Currently a lot of sites are blogging about memcached attacks on servers; here are some details:

  • Memcached servers need an installed and running service called „memcached“
  • Websites need a PHP extension like php7.0-memcached to connect to the memcached service via its API
  • The memcached service uses its own config file, on Debian at /etc/memcached.conf
  • By default it MUST listen on localhost or a Unix socket (a quick local check follows below)
  • Admins MUST set up a FIREWALL like „ufw“ (iptables) and MUST check their own server for OPEN PORTS with nmap
  • The problem is that attackers can abuse an exposed memcached with ONE PC to fire requests in a 10^6 range, like a BOTNET, because MEMCACHED handles this high count of REQUESTS without going down
  • NEVER HOLD CONFIDENTIAL DATA ON WEBSERVERS!!!
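Quick local check (a sketch; „ss“ is part of iproute2 on Debian, use netstat -lunp if you prefer). It lists the sockets on the memcached port so you can see whether anything is bound to a public address:

$sudo ss -lunp | grep 11211     # UDP listeners on the memcached port
$sudo ss -ltnp | grep 11211     # TCP listeners, should show 127.0.0.1:11211 only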

Test whether the port is open using an nmap port scan with the UDP option, NOT TCP:

sudo nmap -sU -p 11211 www.myserver.xyz

If the scan outputs this, you MUST check or install a FIREWALL!:
Host is up (0.10s latency).
PORT      STATE         SERVICE
11211/udp open|filtered unknown

If the output shows this, you are safe:
PORT      STATE    SERVICE
11211/udp filtered unknown

Check your currently loaded PHP modules:

$sudo php -m

If memcached is listed, the PHP API is active; time to check further..
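A quick one-liner filter (a sketch; it matches both the memcache and memcached extensions):

$sudo php -m | grep -i memcache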

Check whether the memcached service is installed:

$sudo dpkg -l |grep mem

If memcached is listed, the service is installed. Then do:
$sudo ps aux | grep mem

If the output shows:
memcache ... /usr/bin/memcached -m 64 -p 11211 -u memcache -l 127.0.0.1 -P /var/run/memcached/memcached.pid

the service is active and listening; in this example it is bound to 127.0.0.1, which is what you want.
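You can also talk to the service directly on localhost to confirm it answers; a sketch using netcat („stats“ and „quit“ are standard memcached text-protocol commands):

$printf "stats\r\nquit\r\n" | nc 127.0.0.1 11211 | head -n 5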

Sample Config:
/etc/memcached.conf

# memcached default config file
# 2003 - Jay Bonci <jaybonci@debian.org>
# This configuration file is read by the start-memcached script provided as
# part of the Debian GNU/Linux distribution.

# Run memcached as a daemon. This command is implied, and is not needed for the
# daemon to run. See the README.Debian that comes with this package for more
# information.
-d

# Log memcached's output to /var/log/memcached
logfile /var/log/memcached.log

# Be verbose
-v

# Be even more verbose (print client commands as well)
-vv

# Start with a cap of 128 megs of memory (the daemon default is 64).
# Note that the daemon will grow to this size, but does not start out holding this much
# memory
-m 128

# Default connection port is 11211
-p 11211

# Run the daemon as root. The start-memcached will default to running as root if no
# -u command is present in this config file
-u memcache

# Specify which IP address to listen on. The default is to listen on all IP addresses
# This parameter is one of the only security measures that memcached has, so make sure
# it's listening on a firewalled interface.
-l 127.0.0.1

# Limit the number of simultaneous incoming connections. The daemon default is 1024
-c 300

# Lock down all paged memory. Consult with the README and homepage before you do this
# -k

# Return error when memory is exhausted (rather than removing items)
-M

# Maximize core file limit
# -r

# Use a pidfile
-P /var/run/memcached/memcached.pid
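The amplification attacks only work over the UDP port. If your sites use the PHP client over TCP (the usual case), you can switch UDP off completely; a sketch, assuming a memcached build that knows the -U option (newer memcached versions ship with UDP disabled by default anyway):

# add this line to /etc/memcached.conf to disable the UDP listener
-U 0

# then restart the service
$sudo service memcached restart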

Set up a firewall (ufw):

$sudo apt-get install ufw
$sudo ufw allow 80/tcp
$sudo ufw allow 443/tcp
$sudo ufw enable
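Once enabled, ufw denies incoming traffic by default and only lets the allowed ports through. If you want the memcached port blocked explicitly, even if that default policy ever changes, you can add deny rules; a sketch:

$sudo ufw deny 11211/udp
$sudo ufw deny 11211/tcp
$sudo ufw status verbose      # "Default: deny (incoming)" should be listed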

Retest your OPEN ports with an nmap port scan! Do this monthly, because sometimes the firewall can have unnoticed problems!!

Check the memcached log at /var/log/memcached.log for events.

Apache: Count Visits on Console

If you use a webserver like Apache, you can use a small script to analyse your logs. Create an analyse-web.sh script with:

 $sudo nano /home/user/analyse-web.sh 

insert:
#!/bin/bash
# count hits per first log field (in this log format: the virtual host name)
cat /var/log/apache2/access.log | awk '{ print $1 }' | sort | uniq -c
exit 0

System Output:
1573  www.domain2.de
3568  www.domain3.de
..

If you change the „$1“ to another field like „$8“, you will get the count per requested file or folder! Abnormally high counts show you attacks on single files! You can use cron to run the script every 15 minutes and send the result to a user's mailbox (a sketch follows below). And this way you do not need a PHP tool with special PHP rights like Webalizer or similar. (The example output assumes a log format that writes the virtual host as the first field; in the default combined format $1 is the client IP instead.)
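A sketch for the cron part: make the script executable first (chmod +x /home/user/analyse-web.sh), then put this line into root's crontab (crontab -e as root, because the Apache log is usually not world-readable). The user name „user“ and the subject are only examples, and „mail“ needs a working local MTA (e.g. the mailutils package):

*/15 * * * * /home/user/analyse-web.sh | mail -s "apache visit counts" user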

Apache: Analyse Logs for Spam Bots

If you administer an Apache webserver, you often see thousands of visits a day on your blogs.

Background:
These are not real users; in my logs these visits are made by spam bots like Xovi.de or the xovibot.net bots!
On its info pages this company says admins should disallow crawling via robots.txt, but the bots IGNORE those settings!
In my opinion this behaviour conflicts with German data protection law („Datenschutz“).

"Mozilla/5.0 (compatible; XoviBot/2.0; +http://www.xovibot.net/)"

Solution:

  • On Linux, set up a firewall like ufw and block these IP ranges
  • To find out the IPs, do the following (the awk field must be the one that holds the client IP: $1 in Apache's combined log format, $2 in vhost_combined):

$sudo cat /var/log/apache2/access.log | grep xovibot.net | awk '{ print $1 }' | sort | uniq -c | sort -n > x.log

  • Now read x.log with cat

     46 212.224.119.143
     52 185.53.44.101
     54 212.224.119.140
     59 185.53.44.104
     62 212.224.119.142
     71 185.53.44.102
     75 185.53.44.103
     80 185.53.44.67
     80 212.224.119.141
     83 185.53.44.68
     87 185.53.44.43
     87 185.53.44.69
     96 185.53.44.70
    106 185.53.44.73
    108 185.53.44.51
    110 185.53.44.74
    113 185.53.44.55
    116 185.53.44.45
    116 185.53.44.53
    120 185.53.44.56
    131 185.53.44.71
    132 185.53.44.97
    137 185.53.44.46
    137 212.224.119.144
    141 212.224.119.182
    142 185.53.44.47
    146 185.53.44.41
    150 185.53.44.93
    152 185.53.44.188
    152 185.53.44.203
    152 185.53.44.64
    152 185.53.44.99
    153 185.53.44.184
    154 185.53.44.181
    154 185.53.44.82
    155 212.224.119.139
    156 185.53.44.92
    158 185.53.44.160
    159 185.53.44.202
    160 185.53.44.177
    160 185.53.44.178
    161 185.53.44.175
    163 185.53.44.187
    165 185.53.44.186
    166 185.53.44.189
    168 185.53.44.200
    172 185.53.44.90
    173 185.53.44.159
    173 185.53.44.72
    175 185.53.44.98
    176 185.53.44.96
    177 185.53.44.149
    179 185.53.44.157
    179 185.53.44.183
    183 185.53.44.148
    185 185.53.44.158
    185 185.53.44.63
    186 185.53.44.152
    188 185.53.44.201
    191 185.53.44.176
    191 185.53.44.80
    193 185.53.44.61
    193 185.53.44.94
    202 185.53.44.62

  • And insert their IP ranges into the ufw settings:

$sudo ufw insert 1 deny from 185.53.44.0/24 to any       # insert rule
$sudo service ufw force-reload                           # force update firewall
$sudo ufw status numbered                                # test status

  • The „insert 1“ is important because ufw must evaluate the deny rule before any allow rules
  • Check the logs manually every week with the „cat“ filter again and kick new ranges out! A small loop sketch for blocking several ranges at once follows below
  • Remark: this howto works with every bot user-agent! There are more marketing scan bots on the net!
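The loop sketch (the two /24 networks are taken from the log above; adjust them to what your own x.log shows):

for net in 185.53.44.0/24 212.224.119.0/24; do
    sudo ufw insert 1 deny from "$net" to any
done
sudo ufw reload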

More info:
http://webrobots.de/xovibot/

Security: Webserver HTTPS with a Self-Signed Certificate, Do It Yourself in 5 Minutes!!

Today the topic of security and encrypted webserver communication concerns every user who hosts their own websites on the internet. In past decades HTTPS was mainly used by online login pages like shops and banks to protect the communication between a user's PC and the website. But since early 2015 most search engines, like Google, prefer to index websites served over the HTTPS protocol. The background is that a TLS-encrypted connection is not easy to tamper with and makes it harder to push „drive-by“ malware onto the website's visitors.

[Image: Self Signed Certificate Sample]

But a lot of webmasters in the open-source community were angry about this handling. It is not a real problem if you don't want to buy an SSL/TLS certificate: every webmaster can create a self-signed certificate on his webserver if he is able to log in via SSH and configure the webserver, e.g. Apache. Browsers warn about self-signed certificates only on the first visit; once the user installs (trusts) the certificate, the browser no longer warns on later visits!

Search engines like Google do not check the trust of the certificate with their robots, so your site will be placed in the index just as well as in past decades. The ONLY thing is that you MUST move all files, images, internal links and bookmarks to „https://“ so that the „LOCK“ in the browser dialog is „CLOSED“ and „GREEN“ like in the picture.

Of course, if you want, you can buy and install „domain name trusted“ certificates, but if you only host private websites/blogs you will not really want to pay over 100$ per year for the certificates.

Advantages:

  • Secure Login to your Site/Blog
  • Encrypted Transfer of Data
  • Security for your Visitors
  • No drive-by downloads
  • Less Content Stealing

You will notice over the next years that the internet is moving to HTTPS!

To create a certificate, use „OPENSSL“ with the command below, answer the questions of the script, then put the .crt and .key files under /etc/ssl/.. and tell Apache to load them from there (a sketch follows after the command)!

$sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout mysitename.key -out mysitename.crt
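A sketch for the Apache side on Debian/Ubuntu (the paths and the default-ssl site name are the distribution defaults; adjust them if your setup differs):

$sudo cp mysitename.crt /etc/ssl/certs/
$sudo cp mysitename.key /etc/ssl/private/
$sudo chmod 600 /etc/ssl/private/mysitename.key
$sudo a2enmod ssl
$sudo nano /etc/apache2/sites-available/default-ssl.conf   # point SSLCertificateFile and SSLCertificateKeyFile to the two files
$sudo a2ensite default-ssl
$sudo service apache2 restart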