Fix Ubuntu apt Error “public key is not available” (GPG Keys)

If you’ve ever used a package or repository from an Ubuntu Personal Package Archive (PPA), you may receive an error each time you use apt-get. The error looks something like this:

W: GPG error: http://ppa.launchpad.net lucid Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY D8D75E403EBCE749


It’s a simple one-line fix; the command below fetches the GPG key for the repository from keyserver.ubuntu.com:

apt-key adv --keyserver keyserver.ubuntu.com --recv-keys D8D75E403EBCE749

Once you’ve run this command, apt-get will no longer report errors about the missing GPG public key.
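The key ID changes from repository to repository, but it always follows NO_PUBKEY in the error message, so you can pull it out with sed. A small sketch using the example error from above (the error text is stored in a variable purely for illustration):

```shell
# The apt-get error message, stored in a variable for this example.
err="W: GPG error: http://ppa.launchpad.net lucid Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY D8D75E403EBCE749"

# The key ID is the hex string that follows "NO_PUBKEY".
key=$(printf '%s\n' "$err" | sed -n 's/.*NO_PUBKEY \([0-9A-F]*\).*/\1/p')
echo "$key"
```

You can then pass "$key" to apt-key adv --recv-keys instead of copying the ID by hand.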


Ubuntu apt-key Not Working Behind a Firewall? Open Port 11371 or Use Port 80

I was trying to fetch some Ubuntu PGP keys for some ppa.launchpad.net sources, and apt-key kept erroring out. All I had to do was open port 11371 on my firewall, or use http://keyserver.ubuntu.com:80, which sends the request over port 80 instead of the default HKP port (11371).
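For example, the key fetch from the first section might look like this when pinned to port 80 (one common form uses the hkp scheme with an explicit port; the key ID is the example from above, so substitute your own):

```shell
# Fetch the key over port 80 so it passes through firewalls
# that block the default HKP port (11371). Requires root.
apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys D8D75E403EBCE749
```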

It’s that simple actually.

http://superuser.com/questions/64922/how-to-work-around-blocked-outbound-hkp-port-for-apt-keys

OtterBox Defender Series iPhone 4S Warranty Pictures

The OtterBox warranty department asked that I provide them with pictures of the problematic OtterBox I purchased. Here they are. As you can see, the bottom iTunes port cover has a small rubber line that is supposed to fit into the hard shell case; mine was defective on arrival. It worked fine for a while, then stopped working completely.

Why DomainPeople’s New Domain Manager Sucks

DomainPeople upgraded their Domain Name Manager, and it no longer lets you create name servers based on your own domain name. Instead, you’re required to submit a ticket to create or update them. I’m glad I moved to Hexonet, where I have direct access to this feature.

UPDATE: As Andrew points out in the comments below, DomainPeople has since added a “Register Nameservers” option, which should allow you to register nameservers based on your domain.

Configuring MySQL for Low Memory VPS

I’ve found the following configuration helps when you’re trying to keep MySQL’s memory footprint as small as possible on a low-memory VPS.

[mysqld]
port = 3306
socket = /var/lib/mysql/mysql.sock
skip-locking
key_buffer = 16K
max_allowed_packet = 1M
table_cache = 4
sort_buffer_size = 64K
read_buffer_size = 256K
read_rnd_buffer_size = 256K
net_buffer_length = 2K
thread_stack = 64K

# For low memory, Berkeley DB should not be used so keep skip-bdb uncommented unless required
skip-bdb

# For low memory, InnoDB should not be used so keep skip-innodb uncommented unless required
skip-innodb

# Uncomment the following if you are using InnoDB tables
#innodb_data_home_dir = /var/lib/mysql/
#innodb_data_file_path = ibdata1:10M:autoextend
#innodb_log_group_home_dir = /var/lib/mysql/
#innodb_log_arch_dir = /var/lib/mysql/
# You can set .._buffer_pool_size up to 50 - 80 %
# of RAM but beware of setting memory usage too high
#innodb_buffer_pool_size = 16M
#innodb_additional_mem_pool_size = 2M
# Set .._log_file_size to 25 % of buffer pool size
#innodb_log_file_size = 5M
#innodb_log_buffer_size = 8M
#innodb_flush_log_at_trx_commit = 1
#innodb_lock_wait_timeout = 50

[mysqldump]
quick
max_allowed_packet = 16M

[mysql]
no-auto-rehash
# Remove the next comment character if you are not familiar with SQL
#safe-updates

[isamchk]
key_buffer = 8M
sort_buffer_size = 8M

[myisamchk]
key_buffer = 8M
sort_buffer_size = 8M

[mysqlhotcopy]
interactive-timeout


Password Protected Subdirectories Missing from Index Listing with Apache2

If you’ve set up a folder that allows directory index listings, you will be presented with that directory’s contents. Folders and files are shown, and you can navigate through each folder. However, if you’ve password protected a subdirectory, it will not show up in the listing. This is because it’s considered forbidden, and the user is assumed not to have access to it.

To override this behavior, place the following into your document root’s .htaccess file, not the one protecting the subdirectory:

IndexOptions +ShowForbidden

http://httpd.apache.org/docs/current/mod/mod_autoindex.html

Apache DDoS Tool “killapache” Sends Malformed GET Requests

Certain versions of Apache contain a bug (CVE-2011-3192) that leaves them susceptible to a denial of service via malformed Range headers in GET requests. You can find more information about “killapache” at the following site:

http://www.pentestit.com/2011/08/25/killapache-ddos-tool-freezes-apache-web-server/


There is no patch yet; however, there is a workaround for this bug, provided below. On Ubuntu or Debian, create a new file, /etc/apache2/conf.d/killapache-fix, and place the following code in it:

# Drop the Range header when more than 5 ranges.
# CVE-2011-3192
SetEnvIf Range (,.*?){5,} bad-range=1
RequestHeader unset Range env=bad-range

# optional logging.
CustomLog logs/range-CVE-2011-3192.log common env=bad-range
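The SetEnvIf pattern flags any Range header value containing five or more commas, i.e., six or more ranges. You can sanity-check that logic with grep; extended regexes have no lazy quantifier, so the sketch below uses the greedy form (,.*){5,}, which matches the same strings:

```shell
# A Range value with six comma-separated ranges (five commas): flagged.
flagged=$(echo "bytes=0-1,2-3,4-5,6-7,8-9,10-11" | grep -Ec '(,.*){5,}')

# A normal single-range value: not flagged.
clean=$(echo "bytes=0-1023" | grep -Ec '(,.*){5,}')

echo "flagged=$flagged clean=$clean"
```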

You may need to enable the “headers” module, which can be done by running “a2enmod headers”; after that you should be able to reload Apache without errors.


Using “robocopy” to Copy EFS Files

When we need to move large amounts of data from one location to another, we use robocopy. Why? It’s essentially the Windows counterpart to rsync on Linux: it copies NTFS security ACLs and does a file compare on every file to ensure that the destination has the most current version.

So when moving over 50 user folders, it comes in handy. We simply run robocopy a couple of days beforehand to do the large initial sync, then run it again during a maintenance window to pick up any files that have been modified or created since.

It works great! But its biggest feature is that it can copy EFS files without needing the EFS certificate or key: the /EFSRAW switch copies encrypted files from one location to another without ever decrypting the data.
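As a sketch, a pass like the one described above might look like this (the server and share names are hypothetical; /E copies subdirectories including empty ones, /COPYALL copies data, attributes, timestamps, security, owner, and auditing info, and /R and /W limit retries and the wait between them):

```bat
robocopy \\oldserver\users \\newserver\users /E /COPYALL /EFSRAW /R:2 /W:5
```

Running the same command again later only transfers files that changed, which is what makes the two-pass migration practical.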

For more information on the robocopy syntax, please see the following site:

http://ss64.com/nt/robocopy.html