Deploying Diksha
The introduction to Diksha demonstrated its features; this article describes how to deploy it. Note that the actions in this section will incur AWS costs, and these instructions are provided without any guarantees.
Diksha relies on Amazon DynamoDB and Amazon Simple Workflow (SWF).
Figure 1 shows the simplest deployment of Diksha, on a single server. diksha-client is the command-line interface through which users interact with Diksha. diksha-server is the workhorse of Diksha, keeping track of the scheduled jobs. diksha-server does not need to run on the same machine as the user, and there can be more than one diksha-server, in which case the load is automatically distributed between the servers.
There are currently two ways of getting Diksha:
(a) Download the two jars from the latest snapshot (currently 0.0.1)
(b) Compile from source
git clone https://github.com/milindparikh/diksha.git
mvn clean
mvn install
[Figure 1: the simplest deployment of Diksha — diksha-client and diksha-server on a single server]
Setup
Before you can actually start to use Diksha, you must set up an SWF domain, certain tables in DynamoDB, and one entry in one of those tables. All interaction with Diksha by a user is expected to occur through diksha-client. It is assumed that you are familiar with the AWS security model.
If you are an administrator of an AWS account:
As an administrator of an AWS account, you can complete the entire setup and running of Diksha through the command line as a single user.
1. Set up your environment variables:
export AWS_ACCESS_KEY_ID=YOURKEYID
export AWS_SECRET_ACCESS_KEY=YOURKEY
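Before running any client commands, it can help to verify that both variables are actually exported; here is a minimal POSIX-sh check (the variable names are the standard ones read by the AWS SDK):

```shell
# check_aws_env: report whether the AWS credential variables are set.
check_aws_env() {
  for var in AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY; do
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "missing: $var"
      return 1
    fi
  done
  echo "ok"
}
```

If either variable is missing, diksha-client would otherwise fail later with a less obvious authentication error from AWS.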
2. Create the SWF domain, the DynamoDB tables, and a config entry in one table.
The short way
java -jar diksha-client-<SNAPSHOT>.jar -adminit
The long way
=============== begin long way ===============
Diksha Admin
Step 1: Decide on a domain name and create it using admcd
// domainName|domainDescription|workflowRetentionPeriodInDays
-admcd "diksha|dikshadomain|1"
Step 2: Create the supporting DynamoDB tables
-admcdt "SchedulerWorkflowState,1,1,clientId,S,loopState,G:S:1:1"
-admcdt "SchedulerUDF,1,1,functionAlias,S,,"
-admcdt "SchedulerUDJ,1,1,jobName,S,,"
-admcdt "SchedulerUDE,1,1,executionId,S,,"
-admcdt "SchedulerConfig,1,1,configId,S,,"
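The -admcdt argument packs the table definition into one comma-separated string. Judging from the examples above (this field breakdown is an inference, not documented syntax), the fields appear to be: table name, read capacity, write capacity, hash key name, hash key type, and an optional global secondary index spec. A quick shell sketch pulling one spec apart:

```shell
# Split a table spec on commas; the field names used here are inferred
# from the examples, not taken from Diksha's documentation.
spec="SchedulerWorkflowState,1,1,clientId,S,loopState,G:S:1:1"
IFS=, read -r name rcap wcap hashkey keytype gsikey gsispec <<EOF
$spec
EOF
echo "table=$name read=$rcap write=$wcap key=$hashkey($keytype) gsi=$gsikey"
```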
Step 3: Create a configuration through ccfg
//configId|endPoint|domain|socketTimeout|taskList
//cf1|https://swf.us-east-1.amazonaws.com|diksha|70000|HelloWorldList
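The config entry is a single pipe-delimited string; per the comment above, the fields are config id, SWF endpoint, SWF domain, socket timeout (presumably in milliseconds, though that is an assumption), and SWF task list. A small sketch assembling it from its parts:

```shell
# Build the pipe-delimited config string (cf1 and HelloWorldList are the
# example values from the text above).
configId="cf1"
endPoint="https://swf.us-east-1.amazonaws.com"
domain="diksha"
socketTimeout=70000
taskList="HelloWorldList"
cfg="$configId|$endPoint|$domain|$socketTimeout|$taskList"
echo "$cfg"
```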
=============== end long way ===============
If you are NOT an administrator of an AWS account :
You must ask an IAM administrator to create the following roles in the AWS account:
1. diksha-admin
2. diksha-designer
3. diksha-user
4. diksha-workflow
The admin role is associated with creating:
(i) the domain in Simple Workflow
(ii) the tables in DynamoDB
(iii) the initial config
The designer role is associated with creating:
(i) function aliases
(ii) jobs
The user role is associated with:
(i) running different jobs
(ii) seeing the status of different jobs
(iii) occasionally canceling jobs
The workflow role is associated with executing the different jobs on schedule, as requested.
As an IAM administrator, you can generate the security policies with the following command:
java -jar diksha-client-<SNAPSHOT>.jar -adminitsps <awsaccountnbr>
Once the designated policies are attached to the relevant users, the diksha-admin user can use the short or the long way exactly like the AWS administrator, albeit with more limited privileges.
YOU ARE NOW READY TO USE Diksha!
USAGE
Diksha Engine
You must run diksha-engine as an AWS administrator or as a user who has been granted the diksha-workflow policy. You must run diksha-engine on at least one server, but you can run multiple engines on multiple servers; the load will be distributed (approximately evenly) across them.
java -jar diksha-engine-0.0.1.jar
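Running several engines amounts to starting the same jar on several machines (or several times on one machine); each engine polls the same SWF task list, which is what spreads the load. A hypothetical launcher sketch — it only prints the commands rather than executing them, and the log paths are illustrative:

```shell
# Print launch commands for N engine processes; pipe the output to sh to
# actually start them in the background.
N=3
i=1
while [ "$i" -le "$N" ]; do
  echo "nohup java -jar diksha-engine-0.0.1.jar > engine-$i.log 2>&1 &"
  i=$((i + 1))
done
```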
Diksha Designer
Function Aliases
-lcfg cf1 -cf "cool|L|arn:aws:lambda:us-east-1:123456789012:function:echocool"
creates an alias named "cool" for the Lambda function arn:aws:lambda:us-east-1:123456789012:function:echocool (the "L" field marks it as a Lambda function)
Predefined Jobs
-lcfg cf1 -cj "runcooljobeverymin|cool|contextmin|0 0-59 * * * *|2"
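The schedule "0 0-59 * * * *" looks like a six-field, Quartz-style cron expression — seconds, minutes, hours, day-of-month, month, day-of-week — firing at second 0 of every minute; this interpretation is inferred from the job name, not stated by Diksha, and the trailing |2 field is left uninterpreted here. Splitting the expression:

```shell
# Split the cron expression into its six fields (field meanings assumed
# Quartz-style: sec min hour day-of-month month day-of-week).
cron="0 0-59 * * * *"
set -f              # disable globbing so the * fields survive word splitting
set -- $cron
set +f
echo "fields=$# second=$1 minute=$2"
```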
Diksha User
Running jobs
-lcfg cf1 -cj "runcooljobeverymin"
Security Policies
The security policies are listed here for reference; your account number will, of course, be different.
Admin
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "swf:*",
      "Resource": "arn:aws:swf:*:123456789012:/domain/dikshaDomain"
    },
    {
      "Effect": "Allow",
      "Action": "dynamodb:*",
      "Resource": "arn:aws:dynamodb:*:123456789012:*"
    }
  ]
}
Designer
{
  "Version": "2012-10-17",
  "Statement": {
    "Effect": "Allow",
    "Action": [
      "dynamodb:GetItem",
      "dynamodb:BatchGetItem",
      "dynamodb:Query",
      "dynamodb:Scan",
      "dynamodb:PutItem",
      "dynamodb:UpdateItem",
      "dynamodb:DeleteItem",
      "dynamodb:BatchWriteItem"
    ],
    "Resource": [
      "arn:aws:dynamodb:*:123456789012:table/SchedulerUDF",
      "arn:aws:dynamodb:*:123456789012:table/SchedulerUDF/index/*",
      "arn:aws:dynamodb:*:123456789012:table/SchedulerUDJ",
      "arn:aws:dynamodb:*:123456789012:table/SchedulerUDJ/index/*"
    ]
  }
}
User
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "swf:CountOpenWorkflowExecutions",
        "swf:CountClosedWorkflowExecutions",
        "swf:DescribeActivityType",
        "swf:DescribeDomain",
        "swf:DescribeWorkflowExecution",
        "swf:DescribeWorkflowType",
        "swf:GetWorkflowExecutionHistory",
        "swf:ListActivityTypes",
        "swf:ListClosedWorkflowExecutions",
        "swf:ListOpenWorkflowExecutions",
        "swf:RequestCancelWorkflowExecution",
        "swf:SignalWorkflowExecution",
        "swf:StartWorkflowExecution",
        "swf:TerminateWorkflowExecution"
      ],
      "Resource": "arn:aws:swf:*:123456789012:/domain/dikshaDomain"
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:Scan"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:123456789012:table/SchedulerConfig",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerConfig/index/*",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDF",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDF/index/*",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDJ",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDJ/index/*",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerWorkflowState",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerWorkflowState/index/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:PutItem",
        "dynamodb:UpdateItem",
        "dynamodb:DeleteItem",
        "dynamodb:BatchWriteItem"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDE",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDE/index/*"
      ]
    }
  ]
}
Workflow
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "swf:*",
      "Resource": "arn:aws:swf:*:123456789012:/domain/dikshaDomain"
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:Scan"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:123456789012:table/SchedulerConfig",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerConfig/index/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:PutItem",
        "dynamodb:UpdateItem",
        "dynamodb:DeleteItem",
        "dynamodb:BatchWriteItem"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDE",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerUDE/index/*",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerWorkflowState",
        "arn:aws:dynamodb:*:123456789012:table/SchedulerWorkflowState/index/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:*:123456789012:*:*"
    }
  ]
}