Smart Home GUI February 8, 2017 at 10:23 pm

Now I can operate some devices in my home via a GUI – awesome!

Hadoop Object Storage – Ozone February 3, 2017 at 12:14 pm

https://wiki.apache.org/hadoop/Ozone

I downloaded the latest Hadoop development source (hadoop-3.0.0-alpha2), switched to the HDFS-7240 branch where Ozone development is taking place, and built it – success.
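
For reference, the steps were roughly these (a sketch: the GitHub mirror plus the standard Hadoop build invocation from BUILDING.txt, assuming protobuf and the native build toolchain are already in place):

git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout HDFS-7240
mvn clean package -Pdist -DskipTests -Dtar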

 

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ ./bin/hdfs
Usage: hdfs [OPTIONS] SUBCOMMAND [SUBCOMMAND OPTIONS]

OPTIONS is none or any of:

--buildpaths attempt to add class files from build tree
--config dir Hadoop config directory
--daemon (start|status|stop) operate on a daemon
--debug turn on shell script debug mode
--help usage information
--hostnames list[,of,host,names] hosts to use in worker mode
--hosts filename list of hosts to use in worker mode
--loglevel level set the log4j level for this command
--workers turn on worker mode

SUBCOMMAND is one of:

balancer run a cluster balancing utility
cacheadmin configure the HDFS cache
classpath prints the class path needed to get the hadoop jar and the required libraries
crypto configure HDFS encryption zones
datanode run a DFS datanode
debug run a Debug Admin to execute HDFS debug commands
dfsadmin run a DFS admin client
dfs run a filesystem command on the file system
diskbalancer Distributes data evenly among disks on a given node
envvars display computed Hadoop environment variables
erasurecode run a HDFS ErasureCoding CLI
fetchdt fetch a delegation token from the NameNode
fsck run a DFS filesystem checking utility
getconf get config values from configuration
groups get the groups which users belong to
haadmin run a DFS HA admin client
jmxget get JMX exported values from NameNode or DataNode.
journalnode run the DFS journalnode
lsSnapshottableDir list all snapshottable dirs owned by the current user
mover run a utility to move block replicas across storage types
namenode run the DFS namenode
nfs3 run an NFS version 3 gateway
oev apply the offline edits viewer to an edits file
oiv apply the offline fsimage viewer to an fsimage
oiv_legacy apply the offline fsimage viewer to a legacy fsimage
oz command line interface for ozone
portmap run a portmap service
scm run the Storage Container Manager service
secondarynamenode run the DFS secondary namenode
snapshotDiff diff two snapshots of a directory or diff the current directory contents with a snapshot
storagepolicies list/get/set block storage policies
version print the version
zkfc run the ZK Failover Controller daemon

 

As you can see, new subcommands like oz and scm are there.

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ bin/hdfs oz
ERROR: oz is not COMMAND nor fully qualified CLASSNAME.

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ bin/hdfs scm
Error: Could not find or load main class

No luck. I was out of ideas, so I wrote to the Hadoop users list. No answers. After that I tried the Hadoop developers list and got help:

Hi Margus,

It looks like there might have been some error when merging trunk into HDFS-7240, which mistakenly
changed some entries in hdfs script. Thanks for the catch!

We will update the branch to fix it. In the meantime, as a quick fix, you can apply the attached
patch file and re-compile, OR do the following manually:

1. open hadoop-hdfs-project/hadoop-hdfs/src/main/bin/hdfs
2. between
oiv_legacy)
       HADOOP_CLASSNAME=org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewer
     ;;
 and
portmap)
       HADOOP_SUBCMD_SUPPORTDAEMONIZATION="true"
       HADOOP_CLASSNAME=org.apache.hadoop.portmap.Portmap
     ;;
add
oz) 
    HADOOP_CLASSNAME=org.apache.hadoop.ozone.web.ozShell.Shell 
;;
3. change this line
CLASS='org.apache.hadoop.ozone.storage.StorageContainerManager'
to
HADOOP_CLASSNAME='org.apache.hadoop.ozone.storage.StorageContainerManager'
4. re-compile.
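
(Step 3 can also be done as a one-liner – GNU sed, run once against the unpatched script:)

sed -i "s|CLASS='org.apache.hadoop.ozone.storage.StorageContainerManager'|HADOOP_CLASSNAME='org.apache.hadoop.ozone.storage.StorageContainerManager'|" hadoop-hdfs-project/hadoop-hdfs/src/main/bin/hdfs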


I rebuilt it, and that fixed it.

Let's try to play with the new toy.

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ ./bin/hdfs oz -v -createVolume http://127.0.0.1:9864/margusja -user ozone -quota 10GB -root
Volume name : margusja
{
 "owner" : {
 "name" : "ozone"
 },
 "quota" : {
 "unit" : "GB",
 "size" : 10
 },
 "volumeName" : "margusja",
 "createdOn" : "Fri, 03 Feb 2017 10:13:39 GMT",
 "createdBy" : "hdfs"
}

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ ./bin/hdfs oz -createBucket http://127.0.0.1:9864/margusja/demo -user ozone -v
Volume Name : margusja
Bucket Name : demo
{
 "volumeName" : "margusja",
 "bucketName" : "demo",
 "acls" : null,
 "versioning" : "DISABLED",
 "storageType" : "DISK"
}

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ ./bin/hdfs oz -v -putKey http://127.0.0.1:9864/margusja/demo/key001 -file margusja.txt
Volume Name : margusja
Bucket Name : demo
Key Name : key001
File Hash : 4273b3664fcf8bd89fd2b6d25cdf64ae


[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ ./bin/hdfs oz -v -putKey http://127.0.0.1:9864/margusja/demo/key002 -file margusja2.txt
Volume Name : margusja
Bucket Name : demo
Key Name : key002

[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$ ./bin/hdfs oz -v -listKey http://127.0.0.1:9864/margusja/demo/
Volume Name : margusja
bucket Name : demo
{
 "version" : 0,
 "md5hash" : "4273b3664fcf8bd89fd2b6d25cdf64ae",
 "createdOn" : "Fri, 03 Feb 2017 12:25:43 +0200",
 "size" : 21,
 "keyName" : "key001"
}
{
 "version" : 0,
 "md5hash" : "4273b3664fcf8bd89fd2b6d25cdf64ae",
 "createdOn" : "Fri, 03 Feb 2017 12:26:14 +0200",
 "size" : 21,
 "keyName" : "key002"
}
[ozone@bigdata24 hadoop-3.0.0-alpha2-SNAPSHOT]$


To compare with a filesystem: we created the directory /margusja, then the subdirectory margusja/demo, and finally added two files to margusja/demo/.
So the picture is something like:

/margusja (volume)
/margusja/demo (bucket)
/margusja/demo/margusja.txt (key001)
/margusja/demo/margusja2.txt (key002)

The home electric boiler is online December 11, 2016 at 10:45 pm

sonoff pow to Sonoff-MQTT-OTA-Arduino November 27, 2016 at 11:41 am

The Chinese have come out with quite an affordable piece of kit – https://www.itead.cc/sonoff-pow.html. It is a relay switchable over WiFi (230V/16A), enough to control most single-phase gadgets in a household.

[photo: the Sonoff Pow]

If you take the gadget apart (for the question "Why would you?", look for the answer in the book "Hackers: Heroes of the Computer Revolution" by S. Levy), you will find an interesting port inside:

[photo: the serial header inside the Sonoff]

Between GND and VDD there are also the serial RX and TX pins.

Nature abhors a vacuum. On GitHub there is the project https://github.com/arendst/Sonoff-MQTT-OTA-Arduino. Thanks to Ull (alias Märt Maiste), who helped me put these two things together.

From there on it is simple: download the GitHub project, build it, and flash it onto the gadget. Since I did not have a single working FTDI board at hand, an Arduino board got me out of trouble.

[photo: flashing via an Arduino board]

[screenshot]

If you now plug the gadget nicely into the mains and do the rest of the setup, it should get an IP from the home DHCP server, and opening that IP in a web browser should show a page like this:

[screenshot: the device's web interface]

The gadget in question supports the MQTT protocol, which provides a much-needed layer between the hardware and the software.

I installed the mosquitto MQTT server on a Raspberry Pi (thanks to Ull for the hint). Now it is possible to listen to the gadget's status with an MQTT subscribe command: for example whether it is switched on, the voltage, the current consumption, and much more. All of this is also available through the web interface:

[screenshot]

[screenshot]
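
Listening and switching from the shell looks roughly like this (a sketch: the cmnd/stat/tele topic names below are what I would expect as this firmware's defaults, and they depend on your MQTT configuration):

# watch everything the device reports (relay state, telemetry, power readings)
mosquitto_sub -h localhost -t 'stat/sonoff/#' -v
mosquitto_sub -h localhost -t 'tele/sonoff/#' -v

# switch the relay on or off
mosquitto_pub -h localhost -t 'cmnd/sonoff/power' -m on
mosquitto_pub -h localhost -t 'cmnd/sonoff/power' -m off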

 

If you now forward a WAN port to port 22 on the Raspberry Pi, you can (provided the internet is up and everything on the home LAN works) control your gadgets remotely:

[screenshot]
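
For example (a sketch; the hostname and the forwarded port are made up):

# from outside: ssh to the Raspberry Pi through the forwarded port...
ssh -p 2222 pi@my-home.example.org
# ...and flip the relay from there
mosquitto_pub -h localhost -t 'cmnd/sonoff/power' -m on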

On top of that, this whole concoction should fit together with the OpenHub project.

Basys2 with external clock October 24, 2016 at 1:41 pm

I recently discovered that the built-in internal clock is quite unstable: a simple stopwatch I made with it was noticeably imprecise.

I added an external clock (25 MHz) and the picture is much better.

[photo]

Image to matrix using ImageMagick October 3, 2016 at 12:31 pm

I have an example picture:
[image: a handwritten 7]

Now I converted it into a 28×28 image:

[screenshot: the 28×28 image]


ImageMagick:

Original:

[margusja@bigdata18 ~]$ identify seven_28_28.png
seven_28_28.png PNG 527x524 527x524+0+0 8-bit DirectClass 13KB 0.000u 0:00.000

[margusja@bigdata18 ~]$ convert seven_28_28.png -resize 28x28 seven_28_28_new.png
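
Note that plain -resize preserves the aspect ratio, so a non-square input does not necessarily come out exactly 28x28 (here 527x524 happens to round to 28x28). Appending ! to the geometry forces the exact size:

convert seven_28_28.png -resize '28x28!' seven_28_28_new.png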

 

The 28×28-pixel image:

[margusja@bigdata18 ~]$ identify seven_28_28_new.png
seven_28_28_new.png PNG 28x28 28x28+0+0 8-bit DirectClass 1.7KB 0.000u 0:00.000

Dump the image as text:

[margusja@bigdata18 ~]$ convert seven_28_28_new.png txt:-

0,0: (209,203,196,255) #D1CBC4 srgba(209,203,196,1)
1,0: (210,204,197,255) #D2CCC5 srgba(210,204,197,1)
2,0: (209,203,195,255) #D1CBC3 srgba(209,203,195,1)
3,0: (207,201,193,255) #CFC9C1 srgba(207,201,193,1)
4,0: (206,202,194,255) #CECAC2 srgba(206,202,194,1)
5,0: (210,205,197,255) #D2CDC5 srgba(210,205,197,1)
6,0: (210,205,198,255) #D2CDC6 srgba(210,205,198,1)
7,0: (203,198,195,255) #CBC6C3 srgba(203,198,195,1)
8,0: (191,187,189,255) #BFBBBD srgba(191,187,189,1)
9,0: (186,182,184,255) #BAB6B8 srgba(186,182,184,1)
10,0: (203,196,192,255) #CBC4C0 srgba(203,196,192,1)
11,0: (208,202,194,255) #D0CAC2 srgba(208,202,194,1)
12,0: (207,202,194,255) #CFCAC2 srgba(207,202,194,1)
13,0: (209,204,196,255) #D1CCC4 srgba(209,204,196,1)
14,0: (210,205,196,255) #D2CDC4 srgba(210,205,196,1)
15,0: (210,204,196,255) #D2CCC4 srgba(210,204,196,1)
16,0: (211,204,196,255) #D3CCC4 srgba(211,204,196,1)
17,0: (211,205,196,255) #D3CDC4 srgba(211,205,196,1)
18,0: (212,205,197,255) #D4CDC5 srgba(212,205,197,1)
19,0: (212,205,197,255) #D4CDC5 srgba(212,205,197,1)
20,0: (211,205,196,255) #D3CDC4 srgba(211,205,196,1)
21,0: (214,206,198,255) #D6CEC6 srgba(214,206,198,1)
22,0: (215,208,199,255) #D7D0C7 srgba(215,208,199,1)
23,0: (213,206,198,255) #D5CEC6 srgba(213,206,198,1)
24,0: (213,206,198,255) #D5CEC6 srgba(213,206,198,1)
25,0: (213,206,198,255) #D5CEC6 srgba(213,206,198,1)
26,0: (213,206,198,255) #D5CEC6 srgba(213,206,198,1)
27,0: (215,209,201,255) #D7D1C9 srgba(215,209,201,1)

0,27: (198,194,192,255) #C6C2C0 srgba(198,194,192,1)
1,27: (199,195,192,255) #C7C3C0 srgba(199,195,192,1)
2,27: (199,196,191,255) #C7C4BF srgba(199,196,191,1)
3,27: (195,193,189,255) #C3C1BD srgba(195,193,189,1)
4,27: (194,193,189,255) #C2C1BD srgba(194,193,189,1)
5,27: (196,194,191,255) #C4C2BF srgba(196,194,191,1)
6,27: (200,198,195,255) #C8C6C3 srgba(200,198,195,1)
7,27: (201,199,195,255) #C9C7C3 srgba(201,199,195,1)
8,27: (201,199,196,255) #C9C7C4 srgba(201,199,196,1)
9,27: (202,201,198,255) #CAC9C6 srgba(202,201,198,1)
10,27: (203,202,198,255) #CBCAC6 srgba(203,202,198,1)
11,27: (203,202,198,255) #CBCAC6 srgba(203,202,198,1)
12,27: (203,201,197,255) #CBC9C5 srgba(203,201,197,1)
13,27: (203,202,198,255) #CBCAC6 srgba(203,202,198,1)
14,27: (204,203,198,255) #CCCBC6 srgba(204,203,198,1)
15,27: (205,204,199,255) #CDCCC7 srgba(205,204,199,1)
16,27: (204,202,198,255) #CCCAC6 srgba(204,202,198,1)
17,27: (204,203,199,255) #CCCBC7 srgba(204,203,199,1)
18,27: (204,203,199,255) #CCCBC7 srgba(204,203,199,1)
19,27: (203,201,197,255) #CBC9C5 srgba(203,201,197,1)
20,27: (203,200,196,255) #CBC8C4 srgba(203,200,196,1)
21,27: (204,200,197,255) #CCC8C5 srgba(204,200,197,1)
22,27: (204,201,198,255) #CCC9C6 srgba(204,201,198,1)
23,27: (204,202,198,255) #CCCAC6 srgba(204,202,198,1)
24,27: (206,203,198,255) #CECBC6 srgba(206,203,198,1)
25,27: (208,204,200,255) #D0CCC8 srgba(208,204,200,1)
26,27: (207,204,200,255) #CFCCC8 srgba(207,204,200,1)
27,27: (209,207,203,255) #D1CFCB srgba(209,207,203,1)

 

Data that is easier to work with (note that substr($2, 2, 3) works here only because all of these gray values happen to have three digits):

[margusja@bigdata18 ~]$ convert seven_28_28_new.png -colorspace gray seven_28_28_gray.png

[margusja@bigdata18 ~]$ convert seven_28_28_gray.png txt:- | awk '{print $1 " " substr($2, 2, 3)}'

0,0: 154
1,0: 156
2,0: 154
3,0: 150
4,0: 151
5,0: 157
6,0: 157
7,0: 146
8,0: 129
9,0: 121
10,0: 143
11,0: 152
12,0: 152
13,0: 155
14,0: 157
15,0: 156
16,0: 156
17,0: 157
18,0: 158
19,0: 158
20,0: 157
21,0: 160
22,0: 163
23,0: 160
24,0: 160
25,0: 160
26,0: 160
27,0: 164

0,27: 139
1,27: 141
2,27: 141
3,27: 136
4,27: 136
5,27: 138
6,27: 144
7,27: 146
8,27: 146
9,27: 149
10,27: 150
11,27: 150
12,27: 149
13,27: 150
14,27: 152
15,27: 154
16,27: 151
17,27: 152
18,27: 152
19,27: 149
20,27: 148
21,27: 149
22,27: 150
23,27: 151
24,27: 153
25,27: 155
26,27: 155
27,27: 159
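
To dump the whole image as a 28x28 matrix in one go, something like this should work (a sketch, assuming the x,y: (v,v,v,a) line format shown above; splitting $2 on the parentheses and commas picks out the first channel value no matter how many digits it has):

# skip the header line, print 28 gray values per row
convert seven_28_28_gray.png txt:- | awk 'NR > 1 { split($2, a, "[(),]"); printf "%s%s", a[2], ((NR - 1) % 28 == 0 ? "\n" : " ") }'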

Apache Spark image recognition result – not bad at 8:45 am


evaluator: org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator = mcEval_a147352f3495
precision: Double = 0.9735973597359736
confusionMatrix: org.apache.spark.sql.DataFrame = [label: double, 0.0: bigint, 1.0: bigint, 2.0: bigint, 3.0: bigint, 4.0: bigint, 5.0: bigint, 6.0: bigint, 7.0: bigint, 8.0: bigint, 9.0: bigint]
Confusion Matrix (Vertical: Actual, Horizontal: Predicted):
+-----+---+----+----+---+---+---+---+---+---+---+
|label|0.0| 1.0| 2.0|3.0|4.0|5.0|6.0|7.0|8.0|9.0|
+-----+---+----+----+---+---+---+---+---+---+---+
| 0.0|961| 0| 3| 2| 1| 4| 5| 2| 1| 1|
| 1.0| 0|1125| 4| 0| 0| 1| 1| 2| 2| 0|
| 2.0| 3| 2|1005| 5| 1| 1| 2| 4| 9| 0|
| 3.0| 0| 0| 3|992| 0| 1| 0| 4| 6| 4|
| 4.0| 2| 0| 4| 1|953| 1| 3| 3| 2| 13|
| 5.0| 6| 0| 0| 15| 1|858| 5| 1| 4| 2|
| 6.0| 4| 2| 3| 0| 5| 8|936| 0| 0| 0|
| 7.0| 0| 5| 9| 3| 1| 0| 0|992| 1| 16|
| 8.0| 3| 0| 4| 6| 2| 6| 3| 5|944| 1|
| 9.0| 2| 2| 2| 10| 11| 2| 2| 6| 3|969|
+-----+---+----+----+---+---+---+---+---+---+---+

About visualization: one way to present the results is shown above, but a much nicer one is, for example, a graph from an Apache Zeppelin SQL node:

[screenshot: Zeppelin bar chart of the confusion matrix]

What can we see from the picture? For example, for the digit zero we can see that the number of correct predictions is 961, along with a breakdown of the wrongly predicted digits.

In the next picture, when we hover over the bar for 3, we can see a red area on top of it; red in this picture means the digit 5. So 15 fives were wrongly predicted as number 3, which is expected, because 5 and 3 look quite similar to a machine.

[screenshot]

R csv to libsvm September 30, 2016 at 1:07 pm

# e1071 provides write.matrix.csr, SparseM the sparse matrix class
library("e1071", lib.loc="~/Library/R/3.3/library")
library("SparseM", lib.loc="~/Library/R/3.3/library")

# first column is the label, the remaining 784 columns are the 28x28 pixel values
data <- read.csv('/Users/margusja/Downloads/mnist_test.csv')

dim(data)

x <- as.matrix(data[,2:785])
y <- data[,1]

# convert to compressed sparse row format and write out in SVMlight/libsvm format
xs <- as.matrix.csr(x)
write.matrix.csr(xs, y=y, file="test.txt")

Apache Spark hints September 28, 2016 at 1:22 pm

scala> val data = sc.textFile("hdfs://path/to/file")

scala> data.foreach(println) // print all lines from the file

scala> def myPrint(a: String): Unit = { println(a) }

scala> data.foreach(a => myPrint(a)) // prints all lines from the file using the myPrint function

 

scala> case class EmailRow(row: String) // create a class for a row

scala> val df = data.map(x => EmailRow(x)).toDF() // create a dataframe

// show the dataframe

scala> df.show()

scala> df.select("row").show()

df.foreach(println)

 

// create a unique id column for the dataset

scala> import org.apache.spark.sql.functions.monotonicallyIncreasingId

scala> val newDf = df.withColumn("id", monotonicallyIncreasingId) // adds a new column at the end of the current dataset

scala> val ds2 = newDf.select("id", "row") // now id is the first column

scala> ds2.select("id", "row").where(df("row").contains("X-")).show() // filter for something and show it

scala> ds2.count() // how many lines do I have in my dataset

 

val text_file = sc.textFile("hdfs://bigdata21.webmedia.int:8020/user/margusja/titanic_test.csv")
//text_file.map(_.length).collect
//text_file.flatMap(_.split(",")).collect
// word (as a key), 1
text_file.flatMap(_.split(",")).map((_, 1)).reduceByKey(_ + _)

case class Person(name: String, age: String)
val people = text_file.map(_.split(",")).map(p => Person(p(2), p(5))).toDS().toDF()

// age is a String and contains empty fields; let's keep only the numerical values
people.filter($"age" > 0).select(people("age").cast("int")).show()

// let's take the average of people's ages
import org.apache.spark.sql.functions.avg
people.filter($"age" > 0).select(avg(people("age").cast("int"))).show()

Create function in Apache Spark at 1:15 pm

scala> def myPrint (a: String) : Unit = {println(a)}
myPrint: (a: String)Unit

scala> myPrint("Tere maailm")
Tere maailm