Creating Archives from BackupPC

As I talked about in a previous post about BackupPC, it is a very powerful tool when it comes to doing self-hosted backups. The downside shows up when you want to archive a machine out of it. For example, you have backups of a host, but the host is long gone and you just want to archive the data. There are two ways to go about this: you can use the web interface to create a restore tar/zip file to download (which doesn't always work, especially over the internet), or you can create the tar archive on the server, compress it, md5 it, and download it using sftp. I like the second option, mostly because I'm going through it right now. I have a backup server out in the cloud that I need to archive some 50 hosts from, so here is how I did it.

Simply log into the server, su to the backuppc user, and cd to wherever you want the archive to end up.

/usr/share/backuppc/bin/BackupPC_tarCreate -h nameOfHost -n -1 -s '/home' / > ./home.tar

In the example above, I'm getting an archive of the home directory for the host "nameOfHost". You can do this for any backed-up folder. Once done, you can create an md5sum of the file to help verify that it downloaded correctly. You can also bzip2 the file to (hopefully) make it smaller, and md5sum that one as well.
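On the server side, that boils down to a couple of commands. The server name and paths in the sftp lines below are placeholders, so use your own:

bzip2 -9 home.tar                          # produces home.tar.bz2
md5sum home.tar.bz2 > home.tar.bz2.md5

Then from your workstation, pull both files down and check the copy:

sftp user@backup.example.com:/path/to/home.tar.bz2
sftp user@backup.example.com:/path/to/home.tar.bz2.md5
md5sum -c home.tar.bz2.md5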

Either way, it is a great way to get very large archives created so you don't have to go through the browser for everything. Feel free to script it; that's what I did (a rough sketch of my script is below). I was able to start the archives and let them run over the weekend, then download everything once the work week started again.
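Here's roughly what that script looked like. The host names are placeholders, and it assumes you are already running as the backuppc user in the directory where you want the archives:

#!/bin/bash
# Archive /home from a list of BackupPC hosts, then compress and checksum each one.
# The host names below are placeholders; replace them with your own.
for host in host1 host2 host3; do
    /usr/share/backuppc/bin/BackupPC_tarCreate -h "$host" -n -1 -s '/home' / > "${host}-home.tar"
    bzip2 -9 "${host}-home.tar"
    md5sum "${host}-home.tar.bz2" > "${host}-home.tar.bz2.md5"
done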

Did this command work for you? Did it not? What did work for you? Please let me know in the comments.

Use rclone to Get Dropbox Working on Linux Again

A while back, Dropbox dropped a lot of its Linux support, such as support for filesystems like XFS and for EncFS, which broke things for a lot of users. It ended up causing problems for me at work because we use CentOS, and all of a sudden glibc was too old to even run Dropbox headless. Eventually I gave up on Dropbox and started just using it for simple things through the web browser, but then I discovered rclone.

Using rclone, I was not only able to view everything in Dropbox (which, by the way, still worked even though my company uses Okta for single sign-on), but I was also able to mount Dropbox to my local file system! For those of you familiar with WebDAV, this works in a similar way. When you "mount" Dropbox, it doesn't download everything the way the official app does; it all works online. Put files into the mounted folder and they will upload.

Getting started is pretty easy; the following commands were taken from https://rclone.org/dropbox/.

rclone config
n) New remote
d) Delete remote
q) Quit config
e/n/d/q> n
name> remote
Type of storage to configure.
Choose a number from below, or type in your own value
[snip]
XX / Dropbox
   \ "dropbox"
[snip]
Storage> dropbox
Dropbox App Key - leave blank normally.
app_key>
Dropbox App Secret - leave blank normally.
app_secret>
Edit advanced config? (y/n)
y) Yes
n) No
y/n> n
Remote config
Use auto config?
 * Say Y if not sure
 * Say N if you are working on a remote or headless machine
y) Yes
n) No
y/n> Y
If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...
Got code
--------------------
[dropbox]
type = dropbox
token = {"access_token":"BIG LONG TOKEN HIDDEN","token_type":"bearer","expiry":"0001-01-01T00:00:00Z"}
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
Current remotes:
 
Name                 Type
====                 ====
dropbox              dropbox
 
e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q>

So I kind of cheated here, but basically, once you get to the auth step, a link will open in your browser. Log into Dropbox and it will ask whether rclone should be allowed to access your Dropbox. Grant access and you're done. It is pretty easy. Now, I named my remote "dropbox"; maybe that wasn't the best name to differentiate it, but oh well.

Once you get to this point, you can do something like:

rclone ls dropbox:

That will get you a nice list of the files you currently have in your Dropbox.
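You don't have to mount anything to move data around, either; rclone can copy straight to and from the remote. The folder names here are just examples:

rclone copy dropbox:Documents ~/Documents     # pull a folder down from Dropbox
rclone copy ~/Photos dropbox:Photos           # push a local folder up to Dropbox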

Now for the fun part… mount.

There is a ton of information over here at https://rclone.org/commands/rclone_mount/ but really, all you need to do is

rclone mount dropbox:/ /path/to/mount/point &

You must background the process to get your shell back. The mount is only active while that program is running, and it does not appear in your list of mounted drives in Linux. So running something like df will not show the mount point, but whatever user you are logged in as (or that ran the command) will see files when looking in that directory.
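If you mount it often, a small wrapper script saves some typing. This is just a sketch; the mount point is whatever you want it to be, and unmounting is done with the standard FUSE fusermount -u:

#!/bin/bash
# Mount the "dropbox" remote via rclone and print how to undo it.
MOUNT_POINT="$HOME/dropbox"   # assumption: pick whatever path you like

mkdir -p "$MOUNT_POINT"
rclone mount dropbox:/ "$MOUNT_POINT" &
echo "dropbox: mounted at $MOUNT_POINT (rclone PID $!)"
echo "To unmount later: fusermount -u $MOUNT_POINT"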

Building OpenVAS in Slackware

I’m a huge fan of OpenVAS. It is a great tool for probing your network and finding possible security holes. Many of you have probably heard of Nessus, another fantastic tool, but it can be pretty pricey. I would recommend it for business, but for home use, go for OpenVAS.

In many cases, I would recommend you set up a Linux distribution called Kali Linux. It has a lot of really good tools built right in, including OpenVAS, but I've started running into issues with it lately. I'll run a scan, and the system's load gets so high that it becomes completely unresponsive for days at a time, then fails to finish. I'm not sure what I'm doing wrong there, so I decided to wipe the machine and put my good ol' Slackware back on it. After using it for several weeks, I have decided to leave it on Slackware, as those issues have disappeared. So now I'm going to point you in the direction to get OpenVAS installed, plus a few extras that will make things easier.

I'm going to assume you are familiar with slackbuilds.org and hopefully with a wonderful tool called sbopkg, as some wonderful people over there have written build scripts for OpenVAS that will make your life so much better. Kent Fritz has written a great guide on how to get going over on slackbuilds.org. Go through his steps, then come back here.

FYI, I have built and used OpenVAS on both 32- and 64-bit Slackware, and even on ARM using a Raspberry Pi. I've only had one package (hiredis) fail to build using sbopkg, so I had to do it the old-fashioned way: download the build script and source, and build it outside sbopkg.

Note that while going through the instructions over on slackbuilds.org, stop the running processes like openvasmd and openvassd before running any type of sync command. The first time you run them, the syncs require a large amount of memory and will crash on the Raspberry Pi (I'm not sure about the Pi 2, I haven't tried it yet). By ensuring those processes are not running, the syncs are much more likely to finish properly.
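In other words, before kicking off a sync on the Pi, I make sure those two daemons are down, along these lines (this assumes the rc scripts from the SlackBuilds are in place and executable; killall works in a pinch, as the setup script further down shows):

/etc/rc.d/rc.openvasmd stop
/etc/rc.d/rc.openvassd stop

openvas-nvt-sync
openvas-scapdata-sync
openvas-certdata-sync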

Welcome back… I'm assuming you followed the instructions over on slackbuilds.org and are ready to continue. Here are some tips and scripts to make life just a little easier.

First, edit some permissions:

chmod 755 /etc/rc.d/rc.redis
chmod 755 /etc/rc.d/rc.openvassd
chmod 755 /etc/rc.d/rc.openvasmd
chmod 755 /etc/rc.d/rc.gsad

Now we are going to create a bunch of scripts that will simplify everything.

/usr/bin/openvas-start

#!/bin/bash
echo "Starting OpenVAS Services"
/etc/rc.d/rc.redis start
/etc/rc.d/rc.gsad start
/etc/rc.d/rc.openvassd start
/etc/rc.d/rc.openvasmd start

/usr/bin/openvas-stop

#!/bin/bash
echo "Stopping OpenVAS Services"
/etc/rc.d/rc.gsad stop
/etc/rc.d/rc.openvassd stop
/etc/rc.d/rc.openvasmd stop
/etc/rc.d/rc.redis stop

/usr/bin/openvas-feed-update

#!/bin/bash
echo "Updating OpenVAS Feeds"
echo "Stopping OpenVAS if running..."
/usr/bin/openvas-stop
openvas-nvt-sync
openvas-scapdata-sync
openvas-certdata-sync
echo "Rebuilding Database"
openvasmd --rebuild
echo "You can start OpenVAS now if needed"

/usr/bin/openvas-setup

#!/bin/bash
test -e /var/lib/openvas/CA/cacert.pem  || openvas-mkcert -q
if (openssl verify -CAfile /var/lib/openvas/CA/cacert.pem \
    /var/lib/openvas/CA/servercert.pem |grep -q ^error); then
    openvas-mkcert -q -f
fi
openvas-nvt-sync
openvas-scapdata-sync
openvas-certdata-sync
if ! test -e /var/lib/openvas/CA/clientcert.pem || \
    ! test -e /var/lib/openvas/private/CA/clientkey.pem; then
    openvas-mkcert-client -n -i
fi
if (openssl verify -CAfile /var/lib/openvas/CA/cacert.pem \
    /var/lib/openvas/CA/clientcert.pem |grep -q ^error); then
    openvas-mkcert-client -n -i
fi
/etc/rc.d/rc.openvasmd stop
/etc/rc.d/rc.openvassd stop
/etc/rc.d/rc.openvassd start
openvasmd --migrate
openvasmd --rebuild
/etc/rc.d/rc.openvassd stop
killall openvassd
sleep 15
/etc/rc.d/rc.openvassd start
/etc/rc.d/rc.openvasmd start
/etc/rc.d/rc.gsad restart
/etc/rc.d/rc.redis restart
if ! openvasmd --get-users | grep -q ^admin$ ; then
    openvasmd --create-user=admin
fi

Here is a great program that can help find any issues while getting set up. This link is mentioned in Kent's instructions, so hopefully you have it already.

wget https://svn.wald.intevation.org/svn/openvas/trunk/tools/openvas-check-setup -O /usr/bin/openvas-check-setup

Here we are going to chmod those files:

chmod 755 /usr/bin/openvas-start
chmod 755 /usr/bin/openvas-stop
chmod 755 /usr/bin/openvas-feed-update
chmod 755 /usr/bin/openvas-setup
chmod 755 /usr/bin/openvas-check-setup

WOW! That is a lot! Alright, so several files have been created. Here is what each one does.
/usr/bin/openvas-start:
This will start all the services needed.
/usr/bin/openvas-stop:
This will stop all the services.
/usr/bin/openvas-feed-update:
This will update all your feeds.
/usr/bin/openvas-setup:
This script will help if you have any issues. Sometimes the OpenVAS feeds cause an issue, and running this command fixes the problem 99% of the time.
/usr/bin/openvas-check-setup:
This one will help you diagnose issues.

Give it time:
When starting OpenVAS, each part is thrown into the background to finish loading. Depending on your computer's speed, it can take a while before you can do anything. It's best to watch with top, htop, or iotop to see when everything has finished loading, then proceed to use Greenbone.
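If you'd rather not sit in top, a quick watch over the process list does the same job; something like:

watch -n 5 'ps -eo pid,pcpu,pmem,etime,comm | grep -E "openvas|gsad|redis"'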

Possible Issues:
When trying to log in to the Greenbone Security Assistant, you might get an error that says the OMP service could not be found. Try running openvas-check-setup. If it reports that there are no users, run openvas-setup; that will fix it. This is a problem I have seen several times in the past on both Slackware and Kali, so I believe it to be a bug somewhere in OpenVAS.

I think that’s just about it. You should now be up and running with OpenVAS!

Fix BackupPC Not Getting All Your Windows Files

BackupPC is a fantastic tool for backing up all your machines. I use it to back up both Windows and Linux machines. Linux is easy: all you need is SSH and rsync. Windows is kind of a pain; you need to use Windows shares in almost every case. In the future, I'll talk about how to use Cygwin to back up a Windows machine over SSH and rsync.

The problem I have is that there is a bug in Samba versions 3.6 through 4.1 that causes the tar backup function to stop the backup before it has finished, while BackupPC reports the backup as complete. I haven't run into this with every Windows machine, but I have with most. Generally, what triggers it is using a separate account to log in and perform the backups instead of the machine's normal user account. If you back up a Windows machine using the smb method and it appears that not everything is being backed up, then this is the guide you want to follow.

To start, I'm currently running Debian 7 (Wheezy) with Samba version 3.6. I tried getting Samba 4.2 to build, but several of my libraries are out of date. If you are currently running 4.0 or 4.1, you might be able to build 4.2 on your server. Otherwise, go with 3.5.22 (the latest 3.5 release at the time of this writing). See https://bugzilla.samba.org/show_bug.cgi?id=10605 for the bug report.

There are several packages that need to be installed for this to work. Every setup is different, but all I had to install was autoconf, make, and gcc.

apt-get install autoconf make gcc

Now we need to download the Samba sources and build them, but not install them.

cd /opt
wget https://download.samba.org/pub/samba/stable/samba-3.5.22.tar.gz
tar -zxf samba-3.5.22.tar.gz
cd samba-3.5.22/source3/
./autogen.sh
./configure
make

That was the hard part. If Samba didn't build correctly, you might be missing other packages. You may be told what they are; otherwise, Google is your friend.

Now set $SmbClientPath to /opt/samba-3.5.22/source3/bin/smbclient. You can either change $SmbClientPath in your main BackupPC config, or just change it for the hosts that are having issues. If you are reading this, I'm going to assume you know how to do that.
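For reference, the setting in the config is $Conf{SmbClientPath}. If you want a one-liner for a single problem host, something like this works; the host name and the per-host config location are assumptions from my Debian install, so adjust them to match yours (or just use the web UI's config editor):

echo "\$Conf{SmbClientPath} = '/opt/samba-3.5.22/source3/bin/smbclient';" >> /etc/backuppc/pc/somehost.pl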

Now test (this will do a FULL backup, so it can take some time):

/usr/share/backuppc/bin/BackupPC_dump -v -f <computer name>

You can watch as it goes along (note that you will NOT see it running in the GUI). This can take some time, but when it completes you will have an idea of whether everything worked or not.

Did this work for you? Did you build Samba 4.2 or newer? Let me know in the comments.

My Triumph And Trials In The Removal of Windows Server From the Network

A Little History

I've been working for this small company for just over a year and a half. This is my first true system administration job. I've had several in the past where I did admin work part time, or on the side of my normal duties, but finally, after years of trying, I managed to land a great position. Most of my work consists of Linux servers, running everything from Ubuntu to Debian, CentOS, and even a couple of Slackware servers. Some are in house, some are in the cloud. It feels great to finally be doing what I love. There was just one problem… Windows. You see, many years ago my company was part of a much larger organization with hundreds of employees. This smaller company spun off and took with it two Windows Server 2003 systems. These servers once did a lot of things: they managed e-mail (Exchange), printers, file shares, Active Directory, VPN, and internal DNS/DHCP. Since I started, I've been fighting the AD (short for Active Directory) servers, trying to keep them running (I should note that I'm a pure Linux guy; I hardly know Windows at all, and definitely not Windows Server). We also dropped down the number of users who actually used the AD. Most people in this newer, smaller company run their own flavor of Linux.

Damn Windows

A short time ago I informed management that our Windows servers would not get updates for much longer, and that we should plan on a migration. After talking to Microsoft and getting clarification on pricing, I found that just upgrading would cost somewhere around $3,000. While this is not much in company money, it also means a lot of future headache for me having to keep them running, plus I've heard horror stories about the actual upgrade process. I took it upon myself to get rid of this crap and just be done with it.

Now don't get me wrong here. Sure, I'm not proficient with Windows Server, but I wouldn't get rid of something that worked just fine. I've spent a lot of time keeping these servers running. They reboot at random times, half the patches fail to install, and sometimes the services on them will just stop working. The problem here is that these servers do not work! When they do reboot, they take 12 minutes and 34 seconds to boot (quad-core Xeon 3.6GHz, 8GB RAM, 15K drives on RAID 5). During that time, DNS stops working and I get complaints that the internet isn't working, or that it is very slow. Those of you who know what happens when your primary DNS goes down know what I'm talking about. I even had times where, after a reboot, the services failed to start. Let's just say these servers are very broken, and I don't believe Microsoft is worth keeping around when so few computers even need to be on Active Directory in the first place.

Let's Do This

The first step would be to find another DNS/DHCP solution. Easy; I've done this many times before, so I have the new server ready to put in place. I should note that we have a weird network configuration, and as of the writing of this portion, I have no idea if my solution will work. We have several VLANs, and DHCP is "helped" through the system to get to the correct server. I'm not going to go into the technical details of how this all works (partly because I don't fully understand it myself), so you'll just have to hope I figured it all out.

As far as VPN goes, PPTP is weak and apparently not a good protocol to use anymore, so I opted for OpenVPN. It turns out the Windows setup is a little more difficult for non-tech-savvy users, but we will manage.

What Am I Doing?

Now the nightmare begins: removing machines from the AD. I was given two computers when I started at this company: a Windows laptop, and a desktop where I could install any Linux distribution I want. I started the AD removal on my laptop and everything went perfectly. At this point I figured, how hard could this be? So next I removed a Windows server that manages our phone system from the AD (no one touches that machine, we are all too afraid). This one wasn't too bad, but it did kill the backup user. Once I added it back in, all was good and I'm still getting backups. Next came a very old XP machine that hosts QuickBooks. After removing it from the AD, I couldn't log in! OK, boot a recovery disk, wipe the SAM tables (Microsoft's password database), reboot, add a password, done! Woohoo… well, no. Turns out it had a share for our CFO. Crap. It took me a while, but I finally got that share working and he was able to access QuickBooks. As before, this one broke on the backup system, but it was an easy fix, as all I had to do was change out the user our backup software uses to connect to the shared folder. All is good.

Before going too far, I want to let you all know that I'm writing this as things are happening. At this point I still have a machine in our conference room, one Windows 8 machine, four more Windows 7 machines (one of which I'm worried about, since it is our CFO's machine), and a bunch of XP machines in our dev lab. Yep, XP. We have older equipment where the software will not run on anything above XP, so I have to keep them very hidden and spend a little extra time on them to ensure they are OK.

Anyway, I ended up having another issue when doing our conference room PC. It seems this computer didn't keep some of the group policy settings that the others did. Granted, I was able to just change the backup user, but I actually had to create another share. What a pain, right? To make matters worse, I also had to edit the firewall settings to allow ping (my backup software requires that I can ping the machine) from any subnet, and to allow SMB access from any subnet. You see, the backup software host and most other computers are on different subnets, so I had to adjust for that. Live and learn, I guess.

Pressing On… And On… And On…

Here I am again, Thursday morning (almost a week after starting this whole process), wanting to remove more machines from the AD. Then I thought about it again. Due to the configuration of our switch, and the need to forward DHCP packets correctly to the Windows servers, what are the chances that this DHCP helper option won't work correctly with Linux? While the switch does have fantastic documentation, it doesn't tell you squat about what to do on the DHCP server side. My heart is racing, fears are rising. What if I can't get this to work? Am I going to be stuck running Windows Server forever simply because I don't understand this very complex switch? This thing's config hasn't been touched since 2008, and even contacting someone who worked with it back then (pretty cool of the guy to talk to me after not being with the company for over 5 years) proved no help, as he worked with three other people on getting this thing set up, and he didn't know the CLI password (which is where I have to change these settings). I can't get a hold of anyone else from back then. Looks like I'm on my own… again.

Well, I can't test this today; it would interrupt everyone. Looks like I'm going to have to work on something else for the next couple of days, then come in over the weekend.

Fast forward a couple of days, and here I am. Easter Sunday… at work. Oh well, it worked out pretty well this way. Sure, I would like to be spending time with family, but since everyone else is, I figure now is the perfect time to take down the network and move everything over.

So it begins, nice and early. First I disconnect the Windows servers from the network, then I change the IP address associated with the Linux box that will control this network. All services are now up and running.

Time to Test

Well crap, it seems the Windows computers had some issues getting DNS updates. Sometimes I can refer to another computer by name, sometimes only by the full name (meaning adding the internal domain name to it), but only sometimes. After spending hours working on it, I still have no solution… I'm starting to think many of these had issues before, and it could have something to do with their own hostnames. After all, Linux likes it when you give it a domain name. Either way, it works as well as before… I think. So I continue on.

One Linux server got an IP address outside the DHCP range. I have no idea how this is possible. Screw it, you get a static IP and a DNS name. Fixed.

After getting through a couple of machines that just didn't want to play nice, I got through the rest with hardly an issue. It actually went much more smoothly than I thought it would. After a few hours, the network was up and running again!

Now, unfortunately, I wasn't done yet. There were a few more items that needed to be dealt with. First, the new OpenVPN. Done. Oh… that's right, I just needed to make a simple change, and everything works. I had forgotten to adjust the server IP in the configuration files to reflect the server's new address. Tested, and working great. Cool.

D’oh!

What about PPTP, you ask? Well, yes, I did want to get rid of it. The problem was that many of my users were still set up to use it and hadn't been given their new keys for OpenVPN. I'll deal with that next week, but the problem remains of ensuring they can still get in. So I fire up a machine and get to testing (using a 4G modem so I'm outside the corporate network). Connecting… connecting… verifying username and password (I didn't know PPTP was this slow, holy crap!)… damn. It just isn't going to work for me. I've actually never used the old VPN, so I have no idea if it would have ever worked before or not. I hate to say it, but I think I'm going to have to wait until tomorrow morning and see who complains.

UPDATE: Upon further investigation, I've found that Windows Server 2003 will NOT let you use its VPN unless it can run the DNS/DHCP. Shit… Just another reason to move away from Windows.

Up and Running

Now, where was I? Oh right. So, everything is up and running. I found a few machines that had been given static IPs by the Windows servers but were not listed anywhere, so once they got their new addresses, their services stopped working. This is because of our firewall rules. So I adjusted the rules and set static IPs for those systems, and they should now always keep those addresses.

Most internal DNS is working without having to type in the domain name. I'll work on the rest of those throughout the week. I'm not anticipating this will cause any issues for my local users (users from the other office still have to type the internal domain name; I'll work on that later). So maybe now I'll finally head home. I've been here for roughly 12 hours, and I would like to call it a day.

From this point, all I need to do is get all the workstations off the AD that no longer exists. I worry there will be issues with just leaving them alone, since I have no idea how long it will take before these workstations decide they can no longer log in because of the missing AD controller. So over the next week I'll get this taken care of. I just need to get backups working and any mounted drives working for each user, then I'm done! Oh, I can't wait!

Getting There

Fast forward a bit here, and another week has passed. During this time I've ended up with just two desktops that needed to be taken off the old AD. One is Windows 7, the other is 8.1. During the process of getting off the AD, I do have to reboot and perform some work on the firewall, so I like to set up times when I can do this with each user.

First came the Windows 8.1 machine. Oh man, do I hate Windows, especially 8.1. This thing caused all sorts of problems. There is so much hidden shit all over Windows that it just drives me insane. I couldn't get his account to log in because of group policy. That was the actual error: access denied by group policy. So I checked the group policy on the computer… there wasn't any! Eventually I had to just create him another account, under a different name, and copy his personal files over. What a joke. I even had the same type of issue with the Windows 7 machine, which sucked because the guy handles all our finances and I really didn't want to cause issues for him. His wasn't nearly as bad, but it ended up that I had to copy his files over to a new account anyway. Fortunately, he was able to keep the same user name. Some things ended up in other folders, so after several hours, we located all his files and got him all set up and good to go. I was pretty happy about that. I always like it when things work out in the end, even when they seem to be going so badly.

Userland

Speaking of Windows and its userland: I hate how Windows handles this. It makes it very hard to move over to a new machine with all your same settings and everything exactly how you like it. I ended up screwing up one email account because the user runs Outlook, and apparently you can't just "import". You have to export, then import, and you can't just copy config files over. This is according to MS! This is why I run Linux. The last time I moved from one machine to another, I copied (scp) my userland over to the new computer, started X (I like KDE), and guess what? Everything was there exactly how I had it on the other computer! Amazing! Also, I know there are tools provided by Microsoft designed to help with this; unfortunately, those don't work in my situation.

DONE!

So here I am, almost three weeks after starting this project, and I'm finally done. Every computer is getting backups, and they can access and be accessed from other VLANs (I know, I know, that's not how you use VLANs; shut it, I like it). It has been a pain, and I wouldn't recommend it to anyone, especially if you are the ONLY one doing it and you are not a Windows guy. So in the end, here is my advice. If you are running AD, just keep giving MS all your money and hope it keeps working. If you are not running AD, DON'T! Stay away! If you are going to have it, be sure to hire someone just to handle it, and make sure that is their only job.

Thank you for letting me tell my story, and if you made it this far, good on you!

UPDATE: I wish I had saved the link, but I found something about what MS apparently does with the newer versions of Windows Server. I already knew that I would have to pay a lot just for the OS, but then I would have to pay an additional price for each user, or CAL as they call it. Well, apparently CALs are not just for users of Active Directory; you have to have one for each machine that uses the DHCP! I have a lot of Linux servers and even more as virtual machines. I refuse to keep giving money to MS for each machine just to use DHCP/DNS. That is a load of crap. Some people who commented on the article said you don't really have to, but if MS decided to audit your network, you could end up having to pay a lot of money. I don't know how true this is, but I wouldn't be surprised. Glad I got away from that train wreck.