
Thursday, January 14, 2016

Bash script to email new S3 bucket files as compressed attachments (UPDATED)

I've written a simple bash script that checks for new files in an AWS S3 bucket and emails any that it finds as a compressed (tar.gz) attachment. You can find it on my GitHub account under the name "S3-Filer-Mailer". I built it as a supplement for a contact form that relies on S3 as a back end, rather than a PHP mailer or a database. Using S3 for contact forms is attractive because it is so unattractive to spammers: there is no way to hijack this sort of setup to send spam, and no way to get at a database through the form, because it isn't connected to one.

Why not use Amazon's Simple Notification Service (SNS)? For one, AWS charges more for SNS than it does for S3 queries and downloads. For another, if this sort of functionality is available through SNS, it is not clearly documented.

Getting back to the topic of security, the script establishes two network connections: one to S3 to retrieve the files, the other to send the email. The S3 connection is encrypted using TLS. As time permits, I'm going to add an extra pipe to gpg2 to encrypt the attachments themselves and close the loop. In the meantime you can do it yourself by adding a line with gpg -e -r Name foo.txt, where Name is the name you used when generating the public key you want to encrypt the file with. Adding encryption support as a command-line option is easy, but I want to add it as part of more general sanity-checking of input.
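A minimal sketch of that manual step, assuming the archive is named files.tar.gz (the filename and key name here are placeholders):

# Encrypt the archive to a public key before it is attached; "Name" is the
# identity you used when generating the key
gpg2 -e -r "Name" files.tar.gz    # writes files.tar.gz.gpg alongside the original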

The script was built and tested on RHEL, but it should work on any Linux distribution that supports bash. This is a pre-pre-alpha version, so no complaining. The obvious and immediate functional limitation at the moment is that the script assumes a file was created today only if its filename contains today's date as a string, and that string has to be in the format YYYYMMDD. When my copious spare time allows, I will add an option to filter results via regex; for now, users can do this fairly simply by piping an additional grep command between grep ${TODAY} and > ${FILE} on line 16 of S3-Filer-Mailer.sh, as sketched below.
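For instance, assuming the pipeline on that line looks roughly like an aws s3 ls listing piped through grep "${TODAY}" into "${FILE}" (the exact invocation in the repo may differ), the extra filter slots in like this:

# Illustrative only: the aws s3 ls call is an assumption, not the repo's
# exact line; the second grep is the added filename filter
aws s3 ls "${BUCKET}" --recursive | grep "${TODAY}" | grep -E 'contact-form' > "${FILE}"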

The project consists of two files: an executable (S3-Filer-Mailer.sh) and a configuration file (S3-Filer-Mailer.conf). To get things working, move both files to a computer running Linux and edit the settings in S3-Filer-Mailer.conf; that is where you specify your email address and your S3 bucket. You can also limit the script to a subdirectory of your bucket there. The script is recursive, so if you point it at the root of your bucket it will check every subdirectory. For the time being, that is also the only way to watch multiple subdirectories; likewise, disabling recursion requires modifying the executable.
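The conf file ends up looking something like this (the variable names here are illustrative; check the actual .conf in the repo for the exact keys):

# S3-Filer-Mailer.conf -- example values only
BUCKET="s3://my-bucket/contact-form/"   # bucket or subdirectory to watch
EMAIL="you@example.com"                 # address that receives the attachments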

Also, dependencies. There are some. Only one of them, the AWS Command Line Interface, should take more than five seconds to install, and you will need Python for that if you don't already have it. On the bright side, if you want to do cool stuff with AWS on Linux, you should be happy to drag more crap to a CLI, right? The only other dependency is mailx.
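On RHEL, the whole setup boils down to something like this (package names and pip usage can vary by distribution):

sudo yum install mailx     # the mail client the script shells out to
sudo pip install awscli    # the AWS CLI; requires Python
aws configure              # supply your access key, secret key, and default region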

UPDATE: I've moved this from a gist to a full-fledged GitHub repo, and I've made a few updates that make this script significantly less lame.

The earliest version of this required sharutils to uuencode attachments, but that is no longer necessary. Relying entirely on mailx's own encoding also resolved an ongoing issue in which Mozilla Thunderbird did not properly recognize attachments.
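With Heirloom mailx (the default on RHEL), attaching is a single flag; this is a sketch rather than the exact line from the script:

# -a attaches and MIME-encodes the file; other mailx implementations use
# different flags, so this assumes the Heirloom mailx shipped with RHEL
echo "New files found in the bucket." | mailx -s "New S3 bucket files" -a files.tar.gz "${EMAIL}"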

Variables that need to be changed for the script to function have been placed in a separate .conf file.

Thursday, December 13, 2012

Simple T-SQL Script to Kill All Connections to an MSSQL Database


When restoring an MSSQL database, it is common to receive an error similar to this one: 

"The database could not be exclusively locked to perform the operation."
Msg 5030, Level 16, State 12, Line 1 

This error can occur in MSSQL 2005, 2008, or 2012.
There are a number of workarounds for this. Some will tell you to disable the website's application pool in IIS, but that doesn't help if someone is connected remotely using SQL Server Management Studio. You can open Activity Monitor and kill each process by hand, but the processes often reappear faster than you can kill them.
Where that is the case, use the following procedure. Assuming your restore task window is already configured, keep it open; closing it wastes time because you must recreate it. Right-click the database server and select New Query, which you can do without closing the restore window. Paste the script below into the query window, replacing the @DB_NAME placeholder with the actual name of your database, and run it; it severs all connections to that database immediately. Then click back into the restore window and click OK to retry the restore. Assuming the restore task is correctly configured, it will complete without further issue.

USE master
GO

SET NOCOUNT ON

DECLARE @DBName varchar(50)
DECLARE @spidstr varchar(8000)
DECLARE @ConnKilled smallint
SET @ConnKilled = 0
SET @spidstr = ''

-- Replace @DB_NAME with the actual name of your database
SET @DBName = '@DB_NAME'

-- db_id() returns NULL when the database does not exist; IDs 1 through 4
-- are the system databases (master, tempdb, model, msdb)
IF db_id(@DBName) IS NULL
BEGIN
    PRINT 'Database not found'
    RETURN
END
IF db_id(@DBName) <= 4
BEGIN
    PRINT 'Connections to system databases cannot be killed'
    RETURN
END

-- Concatenate a KILL command for every spid connected to the database,
-- counting connections as we go
SELECT @spidstr = @spidstr + 'kill ' + convert(varchar, spid) + '; ',
       @ConnKilled = @ConnKilled + 1
FROM master..sysprocesses
WHERE dbid = db_id(@DBName)

IF LEN(@spidstr) > 0
BEGIN
    EXEC(@spidstr)
    PRINT convert(varchar, @ConnKilled) + ' connection(s) killed.'
END

(hat tip to Adam Lacey)
