
AngularJS & Cloud

CloudConf 2014 Gabriele Mittica


www.corley.it
JavaScript & Cloud?
Cloud based database and storage services are very popular.
Why?
The Cloud is
Cheap for startup projects
Ready to scale for growing projects
Rich in services for complex projects
How can we use the cloud with JS?
[Diagram: the classic stack (HTML/JS frontend, REST, PHP backend, MySQL/Mongo) versus an HTML/JS app talking over HTTP directly to the AWS cloud.]
With AngularJS we can create
apps that work with RESTful
resources or directly with cloud
services.
Next steps
#1 - signup to AWS website
#2 - access to the AWS Web console
#3 - create an IAM user with access to all services
#4 - download and use the JavaScript SDK
Introducing Amazon Web Services
Over 25 cloud based services available
Several regions across the world
JavaScript SDK available
http://aws.amazon.com
Signup to AWS on aws.amazon.com
IAM: Identity and Access Management
AWS Identity and Access Management (IAM) enables us to securely control access
to AWS services and resources for our users, setting users and groups and using
permissions to allow and deny their access to AWS resources.
[Diagram: your apps access AWS services through AWS IAM. For example, a backup system gets PUT & GET access to AWS Storage, while a marketing app gets full access to the AWS Email Service.]
We create a user (or a group of
users) with Power User Access
level, in order to grant access
to all services.
Then we have to download the
access and secret keys that we'll
use with the JS SDK.
Now we can use the JS/Browser AWS SDK
Paste in your HTML:
<script src="https://sdk.amazonaws.com/js/aws-sdk-2.0.0-rc6.min.js"></script>
available on http://aws.amazon.com/javascript
Configure with your IAM credentials:
<script>
AWS.config.update({accessKeyId: 'akid', secretAccessKey: 'secret'});
AWS.config.region = 'eu-west-1'; //set your preferred region
</script>
Upload a file to Amazon Simple Storage Service with classic JS:
<input type="file" id="file-chooser" />
<button id="upload-button">Upload to S3</button>
<div id="results"></div>
<script type="text/javascript">
var bucket = new AWS.S3({params: {Bucket: 'myBucket'}});
var fileChooser = document.getElementById('file-chooser');
var button = document.getElementById('upload-button');
var results = document.getElementById('results');
button.addEventListener('click', function() {
var file = fileChooser.files[0];
if (file) {
results.innerHTML = '';
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.putObject(params, function (err, data) {
results.innerHTML = err ? 'ERROR!' : 'UPLOADED.';
});
}
else {
results.innerHTML = 'Nothing to upload.';
}
}, false);
</script>
That's all!
Very easy!
[Diagram: the browser uploads the file straight to AWS storage.]
Our keys (primarily the secret one) are exposed.
Bad guys (backend developers?) could use our
keys for malicious intents!
<?php
use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'our key',
    'secret' => 'our secret'
));
while (true) {
    $result = $client->putObject(array(
        'Bucket' => 'myBucket',
        'Key'    => 'data.txt',
        'Body'   => 'Give me a million dollars!'
    ));
}
Solutions:
#1 - Use read-only IAM keys (result: a read-only app)
#2 - Ask users to enter their own keys (result: hard IAM management)
#3 - Work with our own backend (result: target missed)
The #4 solution
We can use
AWS Security Token Service to grant temporary credentials
to non-authenticated users.
They are called Federated Users.
[Diagram: the HTML/JS app logs in with an identity provider, asks STS to assume a role (checked against IAM), and then accesses S3 and the DB with the temporary credentials.]
Create an app that helps users store incomes/expenses and track cashflow.
So we need:
- a database service where we store private cashflow entries
- a storage service where we upload private files (receipts, invoices, bills)
- an authentication service that manages the access to database and storage
- an AngularJS app that merges all the previous
Example
Simple Storage Service
DynamoDB
IAM
STS
Step #1: set the storage service
Simple Storage Service (S3) is a
cloud storage that lets us PUT and GET
private (backups, private images)
and public (js, css, public images) files.
We just have to create a bucket
(folder) in S3 where we'll store the
uploaded files.
Step #2: set the database service
DynamoDB is a fully managed NoSQL
database stored in the cloud.
We pay for the provisioned throughput.
For example:
10 reads / 5 writes per sec = free
100 reads / 25 writes per sec = $31.58/month
We just have to create a new
table where we store the users'
incomes and expenses.
We set a low throughput for
the beginning.
We have to choose the indexes for the table.
We set a primary key (string type) called userID that will be useful later.
We also set a range key (numeric type) called timestamp that lets us query
the entries quickly, ordered by insert datetime.
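To make the schema concrete, here is a minimal sketch of how one entry maps to DynamoDB's typed attribute format (buildFinanceItem is a hypothetical helper, not part of the AWS SDK; the attributes beyond userID and timestamp are made up for illustration):

```javascript
// DynamoDB's wire format wraps every attribute in a type key:
// S for strings, N for numbers (numbers are sent as strings).
// userID is the hash key, timestamp the range key, as in the table above.
function buildFinanceItem(userId, entry) {
  return {
    userID:    {S: userId},
    timestamp: {N: String(entry.timestamp)},
    amount:    {N: String(entry.amount)},
    note:      {S: entry.note}
  };
}

var item = buildFinanceItem('amzn1.account.EXAMPLE', {
  timestamp: 1397000000,
  amount: -42.5,
  note: 'office supplies'
});
// item can now be passed as the Item parameter of putItem
```

Querying by userID with the timestamp range key then returns the entries already sorted by insert time.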
We want to manage the authentication with certified external websites
such as Amazon, Google and Facebook.
Step #3: create federated apps
Go to http://login.amazon.com and create a new app.
There you can get the code for the login button, as follows:
Creating an app we get an ID and we can set the
allowed sources (the URLs of our test/production
web application). HTTPS is required.
Go back to http://aws.amazon.com/console,
and add a new role in the IAM area linked to our Amazon App.
Step #4: create the IAM role
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Principal": {
                "Federated": "www.amazon.com"
            },
            "Condition": {
                "StringEquals": {
                    "www.amazon.com:app_id": "XXYYZZ"
                }
            }
        }
    ]
}
IAM lets users from our Amazon Login app assume the role:
We add policy to this role giving
full access to S3 and DynamoDB
thanks to the policy generator:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1291088462000",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::financeapptest"
            ]
        },
        {
            "Sid": "Stmt1291088490000",
            "Effect": "Allow",
            "Action": [
                "dynamodb:*"
            ],
            "Resource": [
                "arn:aws:dynamodb:eu-west-1:728936874546:table/finance"
            ]
        }
    ]
}
This is an example of the policy generated (full access to S3 bucket and DynamoDB table):
<div id="amazon-root"></div>
<script type="text/javascript">
window.onAmazonLoginReady = function() {
amazon.Login.setClientId('YOUR-CLIENT-ID');
};
(function(d) {
var a = d.createElement('script'); a.type = 'text/javascript';
a.async = true; a.id = 'amazon-login-sdk';
a.src = 'https://api-cdn.amazon.com/sdk/login1.js';
d.getElementById('amazon-root').appendChild(a);
})(document);
</script>
Get your code and add it right after <body>:
<script type="text/javascript">
document.getElementById('LoginWithAmazon').onclick = function() {
var options = { scope : 'profile' };
amazon.Login.authorize(options, 'https://www.example.com/handle_login.php');
return false;
};
</script>
Create a button (#LoginWithAmazon) and bind the click event:
new AWS.STS().assumeRoleWithWebIdentity({
RoleArn: 'the-arn-of-the-role',
RoleSessionName: 'the-name-of-the-role',
WebIdentityToken: ACCESS_TOKEN,
ProviderId: "www.amazon.com"
}, function(err, data){
if(data && data.Credentials) {
console.log(data); //we get the Amazon User ID
}
});
User is redirected to https://www.example.com/handle_login.php?ACCESS_TOKEN=XYZ
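A minimal sketch of pulling the token out of that redirect URL (getAccessToken is a hypothetical helper; it assumes the token arrives as an access_token parameter in the query string or hash fragment):

```javascript
// Extract the access_token parameter from a redirect URL such as
// https://www.example.com/handle_login.php#access_token=XYZ&token_type=bearer
function getAccessToken(url) {
  // the token may come back in the query string or in the hash fragment
  var match = /[?#&]access_token=([^&]+)/.exec(url);
  return match ? decodeURIComponent(match[1]) : null;
}

var token = getAccessToken(
  'https://www.example.com/handle_login.php#access_token=Atza%7CXYZ&token_type=bearer'
);
// token is then used as the WebIdentityToken in assumeRoleWithWebIdentity
```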
angular.module('myApp.services', [])
.factory('loggerManager', function(configLogger, $location, $rootScope){
var baseFactory = {
handler: new AWS.STS(),
provider: false,
credentials: {},
id: false
};
baseFactory.logout = function() {
if(baseFactory.provider == "amazon") {
amazon.Login.logout();
}
};
baseFactory.login = function(provider, data, redirect) { /* ... */ };
var dynamo = new AWS.DynamoDB({region: "eu-west-1"});
dynamo.putItem({
TableName: "finance",
Item: data
}, function(err, data) {
if(err) console.log(err);
});
Now we can PUT data to DynamoDB
var bucket = new AWS.S3({params: {Bucket: 'financeapptest'}});
var fileChooser = document.getElementById('file-chooser');
var file = fileChooser.files[0];
if (file) {
var params = {Key: file.name, ContentType: file.type, Body: file};
bucket.putObject(params, function (err, data) {
console.log(data);
});
}
and upload files to S3
How to do that with AngularJS:
'use strict';
angular.module('myApp', [
'ngRoute',
'myApp.filters',
'myApp.services',
'myApp.directives',
'myApp.controllers'
])
.constant('configAWS', {
tableName: "finance5",
bucketName: "financeuploads",
region: "eu-west-1"
})
.constant('configLogger', {
amazonAppId: 'your-amazon.com-app-id',
amazonRoleArn: 'arn:aws:iam::xxxxxx:role/amazon-login',
amazonRoleName: "amazon-login"
});
When you start the APP you have to set the Amazon.com app id and AWS role credentials:
Thanks to that you have a configuration available throughout your app.
Now we have to find a way to work with the cloud services, integrating
the AWS SDK into our app. There are several ways to do that with
AngularJS. In this case we create factory services to wrap each
needed feature.
Firstly, a service to manage the auth.
'use strict';
angular.module('myApp.services', [])
//provide methods to manage credentials of federated user
.factory('loggerManager', function(configLogger, $location, $rootScope){
var baseFactory = {
handler: new AWS.STS(),
provider: false,
credentials: {},
id: false
};
/**
* logout method (based on ID provider)
*/
baseFactory.logout = function() {
if(baseFactory.provider == "amazon") {
amazon.Login.logout();
}
};
/**
* login method (based on provider)
* @param provider the name of provider
* @param data data used for the login
* @param redirect the destination after login
*/
baseFactory.login = function(provider, data, redirect) {
//get the access params from AWS with the amazon login
if(provider == "amazon") {
AWS.config.credentials = new AWS.WebIdentityCredentials({
RoleArn: configLogger.amazonRoleArn,
ProviderId: 'www.amazon.com', // this is null for Google
WebIdentityToken: data.access_token
});
//assume role from AWS
baseFactory.handler.assumeRoleWithWebIdentity({
RoleArn: configLogger.amazonRoleArn,
RoleSessionName: configLogger.amazonRoleName,
WebIdentityToken: data.access_token,
ProviderId: "www.amazon.com"
}, function(err, data){
//login ok
if(data && data.Credentials) {
baseFactory.provider = provider;
baseFactory.credentials = data.Credentials;
baseFactory.id = data.SubjectFromWebIdentityToken;
if(redirect) {
$location.path(redirect);
$rootScope.$apply();
}
}
});
}
};
/**
* return the access key provided by amazon, google, fb...
*/
baseFactory.getAccessKeyId = function() {
if(baseFactory.credentials.AccessKeyId) {
return baseFactory.credentials.AccessKeyId;
}
else {
return "";
}
};
/**
* return the secret access key provided by amazon, google, fb...
*/
baseFactory.getSecretAccessKey = function() {
if(baseFactory.credentials.SecretAccessKey) {
return baseFactory.credentials.SecretAccessKey;
}
else {
return "";
}
};
/**
* return the user id
*/
baseFactory.getUserId = function() {
if(baseFactory.id) {
return baseFactory.id;
}
else {
return "";
}
};
return baseFactory;
})
Then, a service to work with S3. This is a tiny example:
// provides methods to put and get file on S3
.factory('s3Ng', function(configAWS, loggerManager){
var baseFactory = {
handler:false
};
/**
* start the service
*/
baseFactory.build = function() {
baseFactory.handler = new AWS.S3({params: {Bucket: configAWS.bucketName}});
};
/**
* put file on the cloud storage
* @param fileName
* @param fileBody
*/
baseFactory.put = function(fileName, fileBody) {
var params = {Key: loggerManager.provider + "/" + loggerManager.getUserId()
+ "/" + fileName, Body: fileBody};
baseFactory.handler.putObject(params, function (err, data) {
console.log(data);
});
};
return baseFactory;
})
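The object key built in s3Ng.put follows the pattern provider/userId/fileName; a standalone sketch of that naming (buildObjectKey is a hypothetical helper, not from the slides' repo):

```javascript
// Build the S3 object key the same way s3Ng.put composes it:
// <provider>/<userId>/<fileName>. Keeping each user's files under
// their own prefix is what makes per-user S3 policies possible later.
function buildObjectKey(provider, userId, fileName) {
  return provider + "/" + userId + "/" + fileName;
}

var key = buildObjectKey('amazon', 'amzn1.account.EXAMPLE', 'invoice.pdf');
// → 'amazon/amzn1.account.EXAMPLE/invoice.pdf'
```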
Working with DynamoDB is more complex. This is an example:
.factory('dynamoNg', function (configAWS, loggerManager) {
var baseFactory = { handler:false };
//build the service
baseFactory.build = function() {
baseFactory.handler = new AWS.DynamoDB({region: configAWS.region});
};
/**
* put an element into the dynamo table. Data is JSON formatted for DynamoDB
* @param table name
* @param data are the data in JSON formatted for DynamoDB
* @return the result of the query
*/
baseFactory.put = function(table, data) {
return baseFactory.handler.putItem({
TableName: table,
Item: data
});
};
/**
* Get an element from a DynamoDB table
* @param table name
* @param data the key to fetch
* @return elements by the table
*/
baseFactory.get = function(table, data) {
console.log("getting");
return baseFactory.handler.getItem({
TableName: table,
Key: data
});
};
/**
* parse the dynamo data
* @param the data
* @returns the data extracted
*/
baseFactory.reverseModel = function(response) {
var result = [];
if(response.data.Count) {
for(var ii in response.data.Items) {
var item = response.data.Items[ii];
result[ii] = {};
for(var kk in item) {
if(item[kk].S) {
result[ii][kk] = item[kk].S;
}
if(item[kk].N) {
result[ii][kk] = item[kk].N;
}
//binary type is missing!
}
}
}
return result;
};
return baseFactory;
})
;
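The reverseModel mapping can be exercised on its own; the sketch below repeats the same unwrapping logic outside the factory, against a hypothetical query response:

```javascript
// Flatten DynamoDB's typed attributes (S for string, N for number)
// back into plain objects, the same logic as reverseModel in dynamoNg.
// Binary (B) attributes are not handled, as noted in the original code.
function reverseModel(response) {
  var result = [];
  if (response.data.Count) {
    for (var ii in response.data.Items) {
      var item = response.data.Items[ii];
      result[ii] = {};
      for (var kk in item) {
        if (item[kk].S) { result[ii][kk] = item[kk].S; }
        if (item[kk].N) { result[ii][kk] = item[kk].N; }
      }
    }
  }
  return result;
}

// hypothetical response shape for a Query on the finance table
var rows = reverseModel({data: {Count: 1, Items: [
  {userID: {S: 'amzn1.account.EXAMPLE'}, timestamp: {N: '1397000000'}, note: {S: 'rent'}}
]}});
// rows[0] → {userID: 'amzn1.account.EXAMPLE', timestamp: '1397000000', note: 'rent'}
```

Note that N values stay strings after unwrapping; callers must cast them if they need real numbers.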
In a controller, start the auth:
.controller('HomeCtrl', function($scope) {
// login button to auth with amazon.com app
document.getElementById('LoginWithAmazon').onclick = function() {
var options = { scope : 'profile' };
amazon.Login.authorize(options, '/dynamofinance/app/#/logged/amazon');
return false;
};
})
And a controller to manage login (after app auth) and logout:
.controller('LoginCtrl', function($scope, $routeParams, loggerManager) {
//user comes back from amazon.com app login success
if($routeParams.access_token) {
//do the login with the provider got by the url
loggerManager.login($routeParams.provider, $routeParams, "/finance/list");
};
})
.controller('LogoutCtrl', function($scope, $routeParams, loggerManager) {
loggerManager.logout();
})
In a controller, how to work with services:
.controller("FinanceCtrl", function($scope, $routeParams, dynamoNg, dynamoFinanceTable, s3Ng,
loggerManager, configLogger, configAWS){
//build services
dynamoNg.build();
s3Ng.build();
//.... More code here
//upload file to S3
$scope.uploadFile = function() {
s3Ng.put("your filename", $scope.upload);
$scope.entryId = false;
};
//store movement
$scope.add = function(el) {
//prepare the data to store
el.date = el.date.toString();
var movement = dynamoFinanceTable.modelAmount(el);
//store the data
$scope.putMovement(movement);
$scope.formReset(false);
};
$scope.putMovement = function(movement) {
dynamoNg.put(configAWS.tableName, movement)
.on('success', function(response) {
$scope.entryId = response.request.params.Item.date.S;
$scope.$apply();
})
.on('error', function(error, response) { console.log(error); })
.send();
};
//... More code here
});
You can find an example on GitHub: it's a work-in-progress app,
don't use it in production. It's under dev and test.
https://github.com/gmittica/angularjs-aws-test-app
But our work is not finished.
The files & data that we're storing in AWS
are protected from unauthorized users,
but are fully visible to other authorized users.
Each user has access to the data of the other ones.
Security problem
We have to refine the policies by adding fine-grained conditions.
Step #5: fix the role policy
In Simple Storage Service, we can limit the access of each user to a
specific subfolder named after his userId.
{
    "Effect":"Allow",
    "Action":[
        "s3:ListBucket"
    ],
    "Resource":[
        "arn:aws:s3:::financeuploads"
    ],
    "Condition":{
        "StringLike":{
            "s3:prefix":[
                "amazon/${www.amazon.com:user_id}/*"
            ]
        }
    }
},
{
    "Effect":"Allow",
    "Action":[
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
    ],
    "Resource":[
        "arn:aws:s3:::financeuploads/amazon/${www.amazon.com:user_id}",
        "arn:aws:s3:::financeuploads/amazon/${www.amazon.com:user_id}/*"
    ]
},
In DynamoDB, thanks to fine-grained access control
we can allow access only to the rows
owned by the user (the rows with his userID).
It is also possible to restrict the access of the
role to specific columns.
{
    "Effect":"Allow",
    "Action":[
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:PutItem",
        "dynamodb:UpdateItem"
    ],
    "Resource":[
        "arn:aws:dynamodb:eu-west-1:728936874646:table/finance5"
    ],
    "Condition":{
        "ForAllValues:StringEquals":{
            "dynamodb:LeadingKeys":[
                "${www.amazon.com:user_id}"
            ]
        }
    }
}
The data are now protected in the right way.
Each user has access only to his own data.
The app is now complete.
Simple Storage Service
DynamoDB
IAM
STS
We can create other apps that work with
the data thanks to different policies:
The cloud is perfect for growing projects,
thanks to the scalability of its services
and the cost savings,
especially in the startup stage.
Thank you!
Any questions?
@gabrielemittica
gabriele.mittica@corley.it
