Dec 07, 2019

Main.Laboratory History


Changed line 10 from:

MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment for developing information processing solutions.

to:

MapReduce cluster with Hadoop and Nutch, which is used as a testing environment for developing information processing solutions.

Changed line 10 from:

MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment. We use it to develop and test our solution for information processing.

to:

MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment for developing information processing solutions.

Changed line 8 from:

Hadoop cluster with 1x server and 12x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM (1 node 48GB), 1TB HDD (2 nodes 500GB). Total storage capacity of the cluster: 11TB.

to:

Hadoop cluster with 1x server and 14x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM (2 nodes 48GB), 1TB HDD (2 nodes 500GB). Total storage capacity of the cluster: 13TB.
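As a sanity check, the headline figures in the newest specification above can be reproduced with a short calculation. This is a sketch under stated assumptions: only the 14 client nodes are counted toward storage (the server node's disk is excluded), and the 24 task slots per client come from 2 CPUs × 6 cores × 2 hardware threads.

```python
# Sanity check of the cluster figures quoted above.
# Assumptions: the server node contributes no storage; 2 of the
# 14 client nodes have 500 GB HDDs, the other 12 have 1 TB HDDs.
CLIENTS = 14
SMALL_DISK_CLIENTS = 2      # client nodes fitted with 500 GB disks
TASKS_PER_CLIENT = 24       # 2 CPUs x 6 cores x 2 threads each

storage_tb = (CLIENTS - SMALL_DISK_CLIENTS) * 1.0 + SMALL_DISK_CLIENTS * 0.5
task_slots = CLIENTS * TASKS_PER_CLIENT

print(f"{storage_tb:.0f} TB total, {task_slots} simultaneous tasks")
# -> 13 TB total, 336 simultaneous tasks
```

This matches the quoted 13TB total capacity; note the figure is raw disk capacity, before any HDFS replication overhead.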

Changed line 8 from:

Hadoop cluster with 1x server and 11x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM, 1TB HDD. Total storage capacity of the cluster: 11TB.

to:

Hadoop cluster with 1x server and 12x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM (1 node 48GB), 1TB HDD (2 nodes 500GB). Total storage capacity of the cluster: 11TB.

Added line 7:
Changed line 5 from:

IKT Data Analytics Supercomputer (DAS)

to:

IKT Data Analytics Supercomputer (DAS)

Changed line 5 from:
  • IKT Data Analytics Supercomputer (DAS)*
to:

IKT Data Analytics Supercomputer (DAS)

Added line 5:
  • IKT Data Analytics Supercomputer (DAS)*
Deleted lines 1-3:

HP Blade system: SMART cluster. Hadoop cluster with 1x server and 11x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM, 1TB HDD. Total storage capacity of the cluster: 11TB.

Added lines 4-7:

HP Blade system: SMART cluster. Hadoop cluster with 1x server and 11x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM, 1TB HDD. Total storage capacity of the cluster: 11TB.

Added lines 2-4:

HP Blade system: SMART cluster. Hadoop cluster with 1x server and 11x client. Node specification: 2x Intel® Xeon® Processor E5-2620 (15M Cache, 2.00 GHz, 7.20 GT/s Intel® QPI, 6x cores, 12x threads) + Hyper-Threading (24 simultaneous tasks per client), 32GB RAM, 1TB HDD. Total storage capacity of the cluster: 11TB.

Changed lines 8-16 from:
  • Apache HBase11
  • Apache Hive12
  • Apache Pig13
  • Cassandra14
  • Riak15
  • Hypertable16
  • CouchDB17
  • MongoDB18
  • Voldemort19
to:
  • Apache HBase
  • Apache Hive
  • Apache Pig
  • Cassandra
  • Riak
  • Hypertable
  • CouchDB
  • MongoDB
  • Voldemort
Changed line 6 from:

Testing frameworks:

to:

Frameworks:

Added line 7:
  • Apache Spark
Changed lines 7-15 from:

• Apache HBase11 • Apache Hive12 • Apache Pig13 • Cassandra14 • Riak15 • Hypertable16 • CouchDB17 • MongoDB18 • Voldemort19

to:
  • Apache HBase11
  • Apache Hive12
  • Apache Pig13
  • Cassandra14
  • Riak15
  • Hypertable16
  • CouchDB17
  • MongoDB18
  • Voldemort19
Changed lines 4-15 from:

MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment. We use it to develop and test our solution for information processing.

to:

MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment. We use it to develop and test our solution for information processing.

Testing frameworks: • Apache HBase11 • Apache Hive12 • Apache Pig13 • Cassandra14 • Riak15 • Hypertable16 • CouchDB17 • MongoDB18 • Voldemort19

Changed line 3 from:

http://ikt.ui.sav.sk/image/hadoop_cluster_small.jpg

to:

http://ikt.ui.sav.sk/image/hadoop_cluster_small.jpg

Changed line 3 from:

http://ikt.ui.sav.sk/image/hadoop_cluster.jpg

to:

http://ikt.ui.sav.sk/image/hadoop_cluster_small.jpg

Changed lines 3-4 from:

http://ups.savba.sk/parcom/pict/P1020159_SMART_small.JPG MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment. We use it to develop and test our solution for information processing.

to:

http://ikt.ui.sav.sk/image/hadoop_cluster.jpg MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment. We use it to develop and test our solution for information processing.

Changed line 3 from:

%rfloat http://ups.savba.sk/parcom/pict/P1020159_SMART_small.JPG

to:

http://ups.savba.sk/parcom/pict/P1020159_SMART_small.JPG

Added line 3:

%rfloat http://ups.savba.sk/parcom/pict/P1020159_SMART_small.JPG

Deleted lines 4-5:

http://ups.savba.sk/parcom/pict/P1020159_SMART_small.JPG

Changed lines 2-3 from:

The IISAS testing environment consists of a computer cluster where scalable architectures (MapReduce - Hadoop) are deployed. We use it to develop and test our solution for information processing.

to:

MapReduce cluster with Hadoop and Nutch deployed is used as a testing environment. We use it to develop and test our solution for information processing.

Changed line 1 from:

LABORATORY

to:

Laboratory and Infrastructure

Changed line 1 from:

LABORATORY

to:

LABORATORY

Added lines 1-2:

LABORATORY

Added lines 1-2:

The IISAS testing environment consists of a computer cluster where scalable architectures (MapReduce - Hadoop) are deployed. We use it to develop and test our solution for information processing.

Added line 1:

http://ups.savba.sk/parcom/pict/P1020159_SMART_small.JPG
