☁ Amazon AWS S3 Service – #Cloud #Storage



Amazon Simple Storage Service (Amazon S3) provides developers and IT teams with secure, durable, highly scalable object storage. Amazon S3 is easy to use, with a simple web services interface to store and retrieve any amount of data from anywhere on the web. With Amazon S3, you pay only for the storage you actually use; there is no minimum fee and no setup cost.

Amazon S3 provides a highly durable storage infrastructure designed for mission-critical and primary data storage. Amazon S3 redundantly stores data in multiple facilities and on multiple devices within each facility.

To increase durability, Amazon S3 synchronously stores your data across multiple facilities before confirming that the data has been successfully stored. In addition, Amazon S3 calculates checksums on all network traffic to detect corruption of data packets when storing or retrieving data. Unlike traditional systems, which can require laborious data verification and manual repair, Amazon S3 performs regular, systematic data integrity checks and is built to be automatically self-healing.
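The same end-to-end integrity idea can be applied on the client side: for a simple (non-multipart) PUT, the ETag that S3 returns is the object's MD5 digest, so comparing it with a locally computed digest detects corruption in transit. A minimal local sketch (the payload and simulated ETag are made up for illustration):

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """Compute the MD5 hex digest used to verify a single-part S3 upload."""
    return hashlib.md5(data).hexdigest()

def verify_upload(data: bytes, etag: str) -> bool:
    """For simple (non-multipart) PUTs, S3's ETag is the object's MD5 digest.
    Comparing it to a locally computed digest detects corruption in transit."""
    return md5_hex(data) == etag.strip('"')

payload = b"hello, s3"
# Simulate the ETag S3 would return for this single-part upload.
returned_etag = '"%s"' % md5_hex(payload)
assert verify_upload(payload, returned_etag)
```

Note that for multipart uploads the ETag is not a plain MD5 of the whole object, so this simple comparison only applies to single-part PUTs.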

Amazon S3 provides additional security with Multi-Factor Authentication (MFA) Delete. When enabled, this feature requires the use of a multi-factor authentication device to delete objects stored in Amazon S3 to help protect previous versions of your objects.


Once MFA Delete is enabled on your Amazon S3 bucket, you can change the versioning state of the bucket or permanently delete an object version only by providing two forms of authentication together:

– Your AWS account credentials
– The concatenation of a valid serial number, a space, and the six-digit code displayed on an approved authentication device
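That concatenation is easy to get wrong, so here is a small sketch of building the value; the serial number and code shown are placeholders:

```python
def mfa_value(serial_number: str, code: str) -> str:
    """Build the MFA string S3 expects: serial number, a space, six-digit code."""
    if len(code) != 6 or not code.isdigit():
        raise ValueError("code must be the six-digit value from the device")
    return f"{serial_number} {code}"

# Placeholder serial/code; pass the result as the x-amz-mfa request header
# (or the --mfa option of the AWS CLI) when deleting an object version.
print(mfa_value("arn:aws:iam::123456789012:mfa/user", "123456"))
```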

Amazon S3 enables you to use Amazon Glacier, an extremely low-cost storage service for data archival. Amazon Glacier stores data for as little as $0.01 per gigabyte per month and is optimized for data that is infrequently accessed and for which retrieval times of 3 to 5 hours are suitable. Examples include digital media archives, financial and healthcare records, raw genomic sequence data, long-term database backups, and data that must be retained for regulatory compliance.

Amazon S3 makes it easy to manage your data. With Amazon S3’s data lifecycle management capabilities, you can automatically archive objects to the lower-cost Amazon Glacier storage or perform recurring deletions, reducing your costs over an object’s lifetime. Amazon S3 also allows you to monitor and control your costs across your different business functions. All of these management capabilities can be administered using the Amazon S3 APIs or the console.
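As an illustration, a lifecycle rule of the kind described above can be expressed as a configuration document. The shape below follows the classic S3 lifecycle API; the rule ID, prefix, and day counts are made-up examples:

```python
# A lifecycle rule that archives objects under "logs/" to Glacier after 30
# days and deletes them after 365 days (all names and numbers are examples).
lifecycle_rule = {
    "Rules": [
        {
            "ID": "archive-then-expire-logs",
            "Prefix": "logs/",
            "Status": "Enabled",
            "Transition": {"Days": 30, "StorageClass": "GLACIER"},
            "Expiration": {"Days": 365},
        }
    ]
}
```

A document like this would be applied to a bucket through the lifecycle configuration API or the console.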

The AWS Free Tier is designed to enable you to get hands-on experience with AWS at no charge for 12 months.

AWS Import/Export accelerates moving large amounts of data into and out of AWS using portable storage devices for transport. AWS transfers your data directly onto and off of storage devices using Amazon’s high-speed internal network and bypassing the Internet. For significant data sets, AWS Import/Export is often faster than Internet transfer and more cost effective than upgrading your connectivity. You can use AWS Import/Export for migrating data into the cloud, distributing content to your customers, sending backups to AWS, and disaster recovery.



Read more about VMware Virtualization for Dummies 2.0

Published by Wiley, this VMware Special Edition of Virtualization 2.0 For Dummies can be downloaded here.

Read More: Amazon Web Services (AWS)


☁ Microsoft Azure #Cloud #Storage Blob


Microsoft Azure – Use metadata

Azure Blob storage is a service for storing large amounts of unstructured data, such as text or binary data, that can be accessed from anywhere in the world via HTTP or HTTPS. You can use Blob storage to expose data publicly to the world, or to store application data privately.

Common uses of Blob storage include:

– Serving images or documents directly to a browser
– Storing files for distributed access
– Streaming video and audio
– Performing secure backup and disaster recovery
– Storing data for analysis by an on-premises or Azure-hosted service

Blob addressing format:

Blobs are addressable using the following URL format:
 http://<storage account>.blob.core.windows.net/<container>/<blob>
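A small helper that assembles such a URL; the account, container, and blob names used below are hypothetical:

```python
def blob_url(account: str, container: str, blob: str, https: bool = True) -> str:
    """Assemble a Blob storage URL: <account>.blob.core.windows.net/<container>/<blob>."""
    scheme = "https" if https else "http"
    return f"{scheme}://{account}.blob.core.windows.net/{container}/{blob}"

# Hypothetical names, following the format above.
print(blob_url("myaccount", "photos", "photo1.jpg"))
# https://myaccount.blob.core.windows.net/photos/photo1.jpg
```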



Read More: How to Create a Microsoft Azure Storage Account

The Blob service contains the following components:

– Storage account: all access to Azure Storage is done through a storage account
– Container: a grouping of a set of blobs; all blobs must be in a container
– Blob: a file of any type and size

The Blob service supports HEAD requests, which can include metadata about the blob. For example, if your application needed the EXIF (Exchangeable Image File) data out of a photo, it could retrieve the photo and extract that data.

To save bandwidth and improve performance, your application can instead store the EXIF data in the blob’s metadata when it uploads the photo. You can then retrieve the EXIF data with only a HEAD request, saving significant bandwidth and the processing time needed to extract the EXIF data each time the blob is read. This is useful in scenarios where you only need the metadata, not the full content of a blob. Note that only 8 KB of metadata can be stored per blob.

EXIF stands for Exchangeable Image File; the data it provides can be stored in JPEG, RAW, and TIFF image file formats.
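The pattern can be sketched locally: store small key/value metadata next to the blob at upload time, serve it from a HEAD-style lookup, and enforce the 8 KB cap. This in-memory stand-in only illustrates the idea; a real application would use the Blob service’s set/get metadata operations:

```python
METADATA_LIMIT = 8 * 1024  # Blob metadata is capped at 8 KB per blob

class FakeBlobStore:
    """In-memory stand-in for a blob container, to illustrate HEAD + metadata."""

    def __init__(self):
        self._blobs = {}

    def upload(self, name: str, content: bytes, metadata: dict) -> None:
        size = sum(len(k) + len(v) for k, v in metadata.items())
        if size > METADATA_LIMIT:
            raise ValueError("metadata exceeds the 8 KB per-blob limit")
        self._blobs[name] = (content, dict(metadata))

    def head(self, name: str) -> dict:
        """Return only the metadata, without transferring the blob body."""
        return self._blobs[name][1]

store = FakeBlobStore()
store.upload("photo1.jpg", b"<jpeg bytes>", {"CameraModel": "X100", "ISO": "400"})
print(store.head("photo1.jpg"))  # metadata only, no body transfer
```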

Azure Import/Export Service

For very large volumes of data (more than 1 TB), Azure Storage offers the Import/Export service, which allows uploading to and downloading from Blob storage by shipping hard drives. You can put your data on a hard drive and send it to Microsoft for upload, or send a blank hard drive to Microsoft to download data. This can be much more efficient than uploading or downloading that volume of data over the network.


Read More: Microsoft – Blob Storage

Uploading Blobs Fast

To upload blobs fast, the first question to answer is: are you uploading one blob or many? Use the guidance below to determine the correct method for your scenario.

Uploading one large blob quickly

To upload a single large blob quickly, your client application should upload its blocks or pages in parallel (being mindful of the scalability targets for individual blobs and for the storage account as a whole). The official Microsoft-provided storage client libraries have the ability to do this. For each library, use the object or property specified below to set the level of concurrency:

– .NET: set ParallelOperationThreadCount on the BlobRequestOptions object to be used.
– Java/Android: use BlobRequestOptions.setConcurrentRequestCount().
– Node.js: use parallelOperationThreadCount on either the request options or the blob service.
– C++: use the blob_request_options::set_parallelism_factor method.
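The same idea can be sketched in Python: split the blob into fixed-size blocks and upload them concurrently with a thread pool, where the pool size plays the role of the concurrency settings listed above. The uploader function here is a local stand-in, not a real storage call:

```python
from concurrent.futures import ThreadPoolExecutor

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MB blocks

def split_blocks(data: bytes, block_size: int = BLOCK_SIZE):
    """Yield (index, chunk) pairs so blocks can be committed in order."""
    for i in range(0, len(data), block_size):
        yield i // block_size, data[i:i + block_size]

def upload_block(block):
    """Stand-in for a real put-block call; returns (index, size) on success."""
    index, chunk = block
    return index, len(chunk)

def parallel_upload(data: bytes, concurrency: int = 4):
    """Upload all blocks in parallel, then 'commit' the block list in order."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(upload_block, split_blocks(data)))
    return sorted(results)  # commit the block list in index order

print(parallel_upload(b"x" * (9 * 1024 * 1024)))  # three blocks: 4 MB, 4 MB, 1 MB
```

The final ordered commit mirrors the block-blob pattern: blocks may arrive in any order, but the committed block list defines the blob’s contents.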

Note: Blob storage details adapted from the Azure documentation.


Read about:

NetApp for Microsoft Private Cloud Deployment Guide

Read about NetApp Products: Click Here

