# Hosting a local file to a remote server and password protecting it

• ssh into the server where you want your files to be hosted.

ssh username@linux.andrew.cmu.edu  [Enter your password when prompted]

• Go to the folder where you want to host the files.
• Create a .htaccess file and a .htpasswd file.
• I used vim. Go to this site and generate a hashed htpasswd entry: enter the username and password you want to protect the folder with, and it will give you a string. Copy that string into your .htpasswd file and save it.
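If you'd rather not use a website, the same string can be generated locally; a minimal sketch, assuming OpenSSL is installed and using placeholder credentials ("alice" / "s3cret"):

```shell
# Generate an Apache-compatible apr1 (MD5) password hash locally.
# "alice" and "s3cret" are placeholder credentials, not from the post.
HASH=$(openssl passwd -apr1 s3cret)
echo "alice:$HASH" > .htpasswd
cat .htpasswd   # one line of the form alice:$apr1$...
```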
• Copy the following to your .htaccess file

 AuthUserFile /home/content/10/9290610/html/appliance_data/.htpasswd
 AuthGroupFile /dev/null
 AuthName "EnterPassword"
 AuthType Basic
 require valid-user

• Go to the directory containing the source folder and secure-copy it to the remote server:
 scp -rp Source_Folder user@server.org:destination_folder_path
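To make the copy step concrete with placeholder names: here both paths are local, so the command can be tried without a server; with a remote destination the syntax is the same, just with the `user@server.org:` prefix.

```shell
# Placeholder folders; with a remote host the destination would be
# user@server.org:destination_folder_path instead of a local path.
mkdir -p Source_Folder && echo '<h1>hello</h1>' > Source_Folder/index.html
scp -rp Source_Folder destination_copy
ls destination_copy
```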

## 5 thoughts on “Hosting a local file to a remote server and password protecting it”

1. You can also generate the password file with the htpasswd utility, saving you from having to go to a website to do it. Also, note that this works only if the remote server is running an Apache web server.

• Thanks for the tip, Mario. Also, I uploaded all the folders to the server, and now I am trying to compress them there, but it is not letting me zip. In particular, for the bigger folders that have 200 files of 200 MB each, it gives a “zip error: Interrupted (aborting)” error after around 10 files. Is there a way to fix it?

• I haven’t run into that problem before, what command are you running to zip them?

• zip -r [filename] [filename]. It compresses the small ones fine, but after about twelve 200 MB files it gives me the error.

• Hmm… not sure. I’d google for a solution if I were you. I think it could be due to overflowing the input with the wildcard that you are using to do the recursion, but I’m not sure. I’d be curious to know what the solution is, though.
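One workaround worth sketching (an assumption on my part, not something confirmed in the thread): tar with gzip often handles large directory trees where zip aborts, since it streams the archive rather than building it in place. `big_folder` is a placeholder name.

```shell
# Not from the thread: an alternative to zip, assuming tar and gzip
# are available on the server. "big_folder" is a placeholder.
mkdir -p big_folder && echo sample > big_folder/data.txt
tar -czf big_folder.tar.gz big_folder   # compress the whole tree
tar -tzf big_folder.tar.gz              # list the contents to verify
```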