Friday, February 1, 2019

Ubiquiti WiFiman

Ubiquiti offers a WiFiman mobile app that provides a suite of tools to monitor, analyze, and improve Wi-Fi performance.

https://blog.ubnt.com/2018/12/11/introducing-wifiman/

https://itunes.apple.com/us/app/ubiquiti-wifiman/id1385561119?mt=8

https://play.google.com/store/apps/details?id=com.ubnt.usurvey

OpenPhish

OpenPhish is a website that provides a list of known phishing URLs.

https://openphish.com/
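
For scripted use, the list can also be pulled down from the command line. The feed.txt path below is an assumption based on OpenPhish's public community feed, so verify the current URL on the site.

wget -q https://openphish.com/feed.txt -O openphish-feed.txt
head openphish-feed.txt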

How to configure Squid on a Linux server

Below are some details of the configuration for a Squid service running on an Ubuntu server; the server in question was running Ubuntu Server 18.04.
#
#  Determine the network interface name
#  (on a minimal 18.04 install, ifconfig may require the net-tools package; ip addr also works)
#
ifconfig
#
#  Set static IP address
#
sudo nano /etc/netplan/50-cloud-init.yaml
#
#  The default should be similar to the text below
#
network:
    ethernets:
        ens32:
            addresses: []
            dhcp4: true
    version: 2
#
#  Modify the file to resemble the following, keeping the interface name found above
#
network:
    ethernets:
        ens32:
            dhcp4: no
            dhcp6: no
            addresses: [192.168.99.99/24]
            gateway4: 192.168.99.1
            nameservers:
                addresses: [8.8.8.8]
    version: 2
#
#  Execute the following command to update and save the configuration
#
sudo netplan apply
#
#  If you are logged in remotely via SSH, your connection will drop
#
#  Once logged back in using the new static IP address, update the OS itself
#
sudo apt-get update
sudo apt-get upgrade
#
#  Disable IPv6
#
sudo nano /etc/sysctl.conf
#
#  Add the following lines
#
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv6.conf.lo.disable_ipv6 = 1
#
#  Reload the sysctl settings to apply the change
#
sudo service procps reload
#
#  Verify IPv6 is disabled; a "1" should be returned
#
cat /proc/sys/net/ipv6/conf/all/disable_ipv6
#
#  Install Squid
#
sudo apt-get install squid3 -y
#
#  Allow traffic to the listening port on the local firewall
#
sudo ufw allow 3128/tcp
#
#  Make a copy of the original default configuration file
#
sudo cp /etc/squid/squid.conf /etc/squid/squid.original
#
#  Create a text file with a list of domains to block
#
sudo nano /etc/squid/blacklist.txt

.google.com
.bing.com
.yahoo.com

#
#  Edit the configuration file for Squid
#
sudo nano /etc/squid/squid.conf
#
#  At the top of the file, add the line below to include more details within the logs
#
debug_options ALL,2
#
#  Use Control-W to find the text "http_access allow localhost"
#
#  Add a rule to block the domains from the text file created above
#
acl blocksitelist dstdomain "/etc/squid/blacklist.txt"
#
#  Add rules to block URLs that contain the specified text
#  This would also block URLs such as google.co.uk
#
acl Yahoo url_regex -i yahoo
acl Google url_regex -i google
acl Bing url_regex -i bing
#
#  Specify the local subnet
#
acl localnet src 192.168.0.0/16
#
#  Add block rules
#
http_access deny blocksitelist
http_access deny Yahoo
http_access deny Google
http_access deny Bing
#
#  Allow the other traffic to pass
#
#  Change the default "http_access allow localhost" to the value below
#
http_access allow localnet
#
#  Use Control-W to find the text "dns_nameservers"
#
#  Configure local DNS servers by adding the following line
#
dns_nameservers 8.8.8.8 8.8.4.4
#
#  Use Control-W to find the text "cache_mgr"
#
#  Set email address that is returned on an error page by adding the following line
#
cache_mgr address@domain.com
#
#  Use Control-W to search for text "Safe_ports"
#
#  This would be used if an internal service used a custom port
#
#  Add port 8383 to the SSL_ports list and add a Safe_ports entry below the http line
#
acl SSL_ports port 443 8383
acl Safe_ports port 80          # http
acl Safe_ports port 8383        # custom internal service
#
#  Use Control-W to search for the text "logfile_rotate"
#
#  Uncomment the line and change the default 0 to 5
#
#  (the command to add a cron job for rotation is listed further below in this post)
#
logfile_rotate 5
#
#  Save the configuration file and then use the command below to load the new parameters
#  Errors will be returned if found
#
sudo squid -k reconfigure
#
#  Another option is to restart the service
#
sudo service squid restart
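#
#  A quick sanity check from a client on the local subnet, using curl (not part of the
#  original walkthrough; 192.168.99.99 is the static address configured earlier)
#
curl -x http://192.168.99.99:3128 -sI http://www.nbcnews.com/    # should pass through the proxy
curl -x http://192.168.99.99:3128 -sI http://www.google.com/     # should be denied (403) by the Google ACL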

#
#  Logs are stored at /var/log/squid
#
#  To transfer log files to a Windows SMB share, install the smbclient application
#
sudo apt-get install smbclient
#
#  Make a copy of the log file to the user's home directory and change the permissions
#
sudo cp /var/log/squid/access.log /home/sam
sudo chmod 777 /home/sam/access.log
#
#  Use the smbclient to access the SMB share and transfer the file over
#
cd /home/sam
smbclient -m SMB2 -U 'server\user' \\\\192.168.x.x\\share
put access.log
#
#  Below are some examples of commands to review the log files with the epoch timestamps converted to readable dates
#
sudo perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e' < /var/log/squid/access.log
sudo cat /var/log/squid/access.log | perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
#
#  Below is an example to view denied traffic
#
sudo grep "DENIED" /var/log/squid/access.log
#
#  To rotate Squid's logs, use this command
#
sudo squid -k rotate
#
#  Use the commands below to add a cron job to rotate the logs at midnight
#  (the apt packages install the squid binary at /usr/sbin/squid)
#
sudo crontab -e
0 0 * * * /usr/sbin/squid -k rotate
#
#  Create a shell script to combine the logs into one file and copy it to an SMB share
#
cd /home/sam
nano logcopy.sh
#
#  Copy the following lines
#
#!/bin/bash
cp /var/log/squid/access.log /home/sam
chmod 777 /home/sam/access.log
cp /var/log/squid/access.log.2 /home/sam
chmod 777 /home/sam/access.log.2
cp /var/log/squid/access.log.3 /home/sam
chmod 777 /home/sam/access.log.3
cp /var/log/squid/access.log.4 /home/sam
chmod 777 /home/sam/access.log.4
cd /home/sam
cat access.log access.log.2 access.log.3 access.log.4 > logs.txt
smbclient -m SMB2 -U 'domain\account' \\\\192.168.x.x\\share
#
#  Mark the file as executable
#
chmod 755 logcopy.sh
#
#  Execute the script with sudo, enter the AD user account password, and use the "put logs.txt" command to copy the file to the SMB share
#
sudo ./logcopy.sh
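#
#  As a non-interactive alternative (not in the original script), smbclient can run the
#  transfer in one step with its -c option
#
smbclient -m SMB2 -U 'domain\account' \\\\192.168.x.x\\share -c 'put logs.txt'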
#
#  One method to determine if the default Squid error page was returned is to search within the logs for the following string
#
http://proxy:3128/squid-internal-static/icons/SN.png
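#
#  For example (not in the original post), count how many times the error page icon was served
#
sudo grep -c "squid-internal-static/icons/SN.png" /var/log/squid/access.log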
#
#  Below is an example where a URL was denied access
#
1545790661.113      1 192.168.254.215 TCP_DENIED/403 3970 GET http://www.nbcnews.com/ - HIER_NONE/- text/html
1545790661.168      0 192.168.254.215 TCP_MEM_HIT/200 11704 GET http://proxy:3128/squid-internal-static/icons/SN.png - HIER_NONE/- image/png
#
#  The information above concerns Squid version 3.  To install Squid version 4 instead, the following steps from https://github.com/diladele/squid-ubuntu were used.
#  The package repository is hosted in Germany, so the geo-blocking rules in place required exceptions to allow the download traffic to pass.
#
#  add diladele apt key
#
wget -qO - http://packages.diladele.com/diladele_pub.asc | sudo apt-key add -
#
#  add repo
#
#  The original command below would return a "Permission denied" error, because the > redirection
#  is performed by the unprivileged shell, which cannot write to /etc/apt/sources.list.d
#  (prefixing echo with sudo would not help for the same reason):
#  echo "deb http://squid48.diladele.com/ubuntu/ bionic main" > /etc/apt/sources.list.d/squid48.diladele.com.list
#
#  Per a Google search, I used the following command to get around the permission error;
#  it runs the whole command, including the redirection, as root.
#
sudo su -c "echo 'deb http://squid48.diladele.com/ubuntu/ bionic main' >> /etc/apt/sources.list.d/squid48.diladele.com.list"
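#
#  An equivalent approach (not from the original post) is to pipe the line through sudo tee,
#  which also performs the file write as root
#
echo "deb http://squid48.diladele.com/ubuntu/ bionic main" | sudo tee /etc/apt/sources.list.d/squid48.diladele.com.list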
#
#  update the apt cache
#
sudo apt-get update
#
#  install the application
#
sudo apt-get install squid-common
sudo apt-get install squid 
sudo apt-get install squidclient
#
#   verify installed version
#
squid -v
#
#  Version 4 returned a warning about the original version 3 parameter below, most likely
#  because the stock Squid 4 configuration already defines a localnet ACL for that range:
#  acl localnet src 192.168.0.0/16
#  So the duplicate line was removed from the configuration file
#

Appscope

Appscope is a directory of Progressive Web Apps, showcasing the best PWA examples. All apps listed run entirely in the web browser and launch instantly without an app download.

https://appsco.pe/

Snorpy

Snorpy is a web-based application for building Snort/Suricata rules graphically. It is simple to use: starting from the Action and Protocol fields, the rule builder displays the assembled rule in the bottom window as each field is filled in.

http://snorpy.com/

https://isc.sans.edu/forums/diary/Snorpy+a+Web+Base+Tool+to+Build+SnortSuricata+Rules/24522/
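
As a rough illustration (not taken from Snorpy itself), a rule assembled field by field might end up looking like the line below, with the message, content, port, and SID chosen arbitrarily.

alert tcp $HOME_NET any -> $EXTERNAL_NET 80 (msg:"Example outbound HTTP rule"; flow:to_server,established; content:"example"; sid:1000001; rev:1;)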

MailStore Home

MailStore Home lets you archive your private email from almost any email source and search through the archived messages quickly.

https://www.mailstore.com/en/products/mailstore-home/

LOIC

LOIC is a network stress test utility.

https://sourceforge.net/projects/loic/

Tilix

Tilix is a tiling terminal emulator which uses the VTE GTK+ 3 widget library. It can display more than one terminal in the same window at the same time.

https://gnunn1.github.io/tilix-web/

https://www.omgubuntu.co.uk/2017/07/tilix-terminix-terminal-emulator-ubuntu
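
On recent Ubuntu releases, Tilix can be installed from the standard repositories; the package name tilix below is an assumption worth confirming for your release.

sudo apt-get install tilix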