Lab 1: Provisioning Durable Storage with S3
Amazon S3 is simple storage for the Internet, designed to make web-scale computing easier for developers. Using this storage we can store data and fetch objects directly from a specific place at high speed, so the load time of the web application goes down and the site feels faster.
Here I already have local infrastructure, and we are going to move it into the cloud step by step. In this lab I am going to create basic storage in the cloud so I can store assets such as images, code files like JavaScript, design files like CSS, and some basic configuration files. After that I will go back to the local project and edit some lines of code so it fetches those assets from the cloud.
This lab will cover the following steps:
- Create buckets and upload files to them.
- Make the files public so anyone can access them from anywhere.
- Replace the local links for images, and the script references in the config file, with their Amazon S3 URLs so they are fetched from S3.
Below is the architecture for this lab:
Step 1. First I am going to create two different buckets in S3, each with a unique name; bucket names must be globally unique. The first is dinostoreresources-ravi and the second is dinostoredegraded-ravi. You can create more buckets to separate the different kinds of content your application uses and then reference each from its own bucket.
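Because bucket names are global and strictly validated, it can save a failed attempt to check a candidate name locally before creating it. Below is a minimal sketch of the core S3 naming rules (3 to 63 characters; only lowercase letters, digits, and hyphens; must not start or end with a hyphen). The actual creation command, `aws s3 mb`, needs AWS credentials, so it is shown only as a comment.

```shell
# Validate a candidate S3 bucket name against the basic naming rules:
# 3-63 characters, only lowercase letters, digits and hyphens,
# and it must not begin or end with a hyphen.
valid_bucket_name() {
  case "$1" in
    *[!a-z0-9-]*) echo "invalid"; return ;;   # illegal character present
  esac
  len=${#1}
  if [ "$len" -ge 3 ] && [ "$len" -le 63 ] \
     && [ "${1#-}" = "$1" ] && [ "${1%-}" = "$1" ]; then
    echo "valid"
  else
    echo "invalid"
  fi
}

valid_bucket_name "dinostoreresources-ravi"   # valid
valid_bucket_name "DinoStoreResources"        # invalid (uppercase letters)

# Once the name checks out, the bucket itself would be created with:
# aws s3 mb s3://dinostoreresources-ravi
```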
Step 2. After creating the buckets, click on the dinostoreresourcesravi bucket and create folders in it. The list of folders is as below:
Step 3. After creating the folders, upload the images into the proper folders: product images go into the productimages folder, and site images such as the logo go into the siteimages folder. To upload the product images from the local project, follow the local path below:
\NET702.DinoStore\NET702.DinoStore\Content\images\product*.png (N.B.: * means there is more than one image with the same prefix; upload all of them)
For the site images folder, upload the images from the local path given below:
\NET702.DinoStore\NET702.DinoStore\Content\images\glyph*.png (N.B.: * means there is more than one image with the same prefix; upload all of them)
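The two wildcard uploads above can be sketched with the AWS CLI. Since actually running `aws s3 cp` needs credentials, the sketch below creates a few stand-in files, expands the same `product*.png` and `glyph*.png` globs, and only prints the command each file would need; the bucket and folder names are the ones used in this lab.

```shell
# Create stand-in files matching the local image names from the lab.
mkdir -p Content/images
touch Content/images/product1.png Content/images/product2.png \
      Content/images/glyph1.png

# Expand each glob and print the `aws s3 cp` command that would upload it.
for f in Content/images/product*.png; do
  echo aws s3 cp "$f" "s3://dinostoreresourcesravi/productimages/$(basename "$f")"
done
for f in Content/images/glyph*.png; do
  echo aws s3 cp "$f" "s3://dinostoreresourcesravi/siteimages/$(basename "$f")"
done
```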
Step 4. After uploading the images, go to the CSS file in the local DinoStore project folder, at the path given below:
Before that, click on a particular image, go to its properties, and copy the URL for that image. Then open the CSS file and change the URL for the background-image, as shown below:
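This background-image change can also be made from the command line. The snippet below is a sketch on a made-up stylesheet: the file name, the image name glyphbg.png, and the S3 object URL are illustrative stand-ins, so substitute the URL you copied from the object's properties.

```shell
# Create a sample stylesheet with a local background-image reference.
cat > Site.css <<'EOF'
body {
  background-image: url('../images/glyphbg.png');
}
EOF

# Swap the local path for the S3 object URL copied from the console.
S3URL='https://dinostoreresourcesravi.s3.amazonaws.com/siteimages/glyphbg.png'
sed "s|url('[^']*glyphbg.png')|url('$S3URL')|" Site.css > Site.css.tmp \
  && mv Site.css.tmp Site.css

grep background-image Site.css   # now points at the S3 URL
```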
Step 5. First change the URL of the background image, then upload that same CSS file into the css folder, along with the two JS files that go into the js folder. The full local paths for these files in the DinoStore project are as below.
For the CSS file, follow the path below:
For the JS files, follow the local path below:
Step 6. After uploading the CSS and JS files to the S3 bucket, go to the local file where these files are referenced and change the URLs. I will go to index.html; see the changes in the pictures below:
After changing the URLs, you can also change the welcome message, as shown in the picture below, and make it any message you want:
Step 7. After that, open the other S3 bucket, named dinostoredegradedravi, upload the index.html file from local to the cloud, and make it public so that everyone can access it from this bucket.
After uploading this file to the bucket, right-click it, click Make Public, and then click OK to grant the permission that makes the file public, as shown in the picture below:
The main purpose of making all the folders and files public is to allow access from anywhere using the objects' URLs. If you do not do this, no one will be able to access them.
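Make Public in the console sets a read ACL on each object individually. An alternative way to get the same effect for everything in a bucket at once is a bucket policy; the fragment below is a sketch of how that could look for the resources bucket in this lab (attached under the bucket's permissions), not something the console click generates.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForDinoStoreResources",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::dinostoreresourcesravi/*"
    }
  ]
}
```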
Step 8. After uploading this file, configure the bucket as a website: enable static website hosting in the bucket properties and set this index file as the index document. You can host your static website entirely on Amazon S3; once you enable static website hosting for the bucket, all of its content is accessible to web browsers via the bucket's Amazon S3 website endpoint.
You can also upload an error.html page and put that page in the error document field.
After uploading the error file, click on it and make it public.
After that, go to the properties, add this page name in the error document field, and hit Save:
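The same index and error document settings can also be expressed as a website configuration document. The fragment below is a sketch assuming the file names used in this lab; with the AWS CLI it could be applied with something like `aws s3api put-bucket-website --bucket dinostoredegradedravi --website-configuration file://website.json`, which needs credentials to actually run.

```json
{
  "IndexDocument": { "Suffix": "index.html" },
  "ErrorDocument": { "Key": "error.html" }
}
```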
Step 9. After enabling web hosting, go to the other bucket, dinostoreresources, and make all of its folders public:
Step 10. Change the URLs for bootstrap.css, jquery-2.0.2.js, and bootstrap.js, as well as the site logo, in the Site1.Master page of the local project in Visual Studio:
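After the change, the references in the master page point at the bucket instead of the local Content folder. The fragment below is only a sketch of what that looks like: the bucket URL pattern follows this lab, but the exact file names (for example logo.png) and markup are illustrative stand-ins, not copied from the real Site1.Master.

```html
<!-- CSS and JS now served from the dinostoreresourcesravi bucket
     (illustrative URLs; substitute the ones copied from the console). -->
<link href="https://dinostoreresourcesravi.s3.amazonaws.com/css/bootstrap.css" rel="stylesheet" />
<script src="https://dinostoreresourcesravi.s3.amazonaws.com/js/jquery-2.0.2.js"></script>
<script src="https://dinostoreresourcesravi.s3.amazonaws.com/js/bootstrap.js"></script>

<!-- Site logo, assuming a hypothetical logo.png in the siteimages folder. -->
<img src="https://dinostoreresourcesravi.s3.amazonaws.com/siteimages/logo.png" alt="DinoStore logo" />
```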
The picture below shows the URL for the logo image:
After all these steps, check the URLs of the images; below you can see the logo URL, which now comes from the S3 bucket:
In the image below you can see a product image, which also comes from the S3 bucket:
At the end I can say that with an Amazon S3 bucket, data can be transferred much faster, but we should use it only for important data, such as backups for the company server and data that requires high availability.
Basically, a new user gets up to 5 GB of storage on a new AWS registration, plus 20,000 GET requests, 2,000 POST requests, and up to 15 GB of data transfer per month during the first year. As for storage charges, they depend on which region you use: for example, in Asia Pacific the first 50 GB costs $0.025 per GB. Among request operations only DELETE is free; the rest are chargeable, for example around $0.0055 per 1,000 POST requests and $0.0044 per 10,000 GET requests. Data transfer within S3 in the same region is free, but transfers to S3 in another region are charged, and the rates differ.
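As a quick worked example of the figures above (the rates are the ones quoted in this lab for Asia Pacific; real AWS pricing changes over time, so treat the numbers as illustrative), the monthly bill for 50 GB stored, 100,000 GETs, and 10,000 POSTs would come out as:

```shell
# Estimate a monthly S3 bill from the per-unit rates quoted above.
awk 'BEGIN {
  storage = 50 * 0.025                 # 50 GB  at $0.025/GB      = $1.25
  gets    = (100000 / 10000) * 0.0044  # GETs   at $0.0044/10,000 = $0.044
  posts   = (10000 / 1000) * 0.0055    # POSTs  at $0.0055/1,000  = $0.055
  printf "monthly estimate: $%.3f\n", storage + gets + posts
}'
# prints: monthly estimate: $1.349
```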
So, besides S3, there is another storage service named Glacier. Its storage charge is only $0.005 per GB, which is much less than S3, and for requests it charges only for uploads, at about $0.060 per 1,000 requests. The catch is a 90-day minimum: if I delete data before 90 days, they still charge for the full 90 days. In the end I believe we should keep only highly secure and frequently accessed data in S3 and move the rest into Glacier, which will be beneficial for the company.