This repository has been archived by the owner on Jun 27, 2023. It is now read-only.

Merge pull request #39 from thiagocoutinhor/v0.1.3-alpha
v0.1.3-alpha
thiagocoutinhor authored Apr 5, 2020
2 parents 9ebbb4b + c43581f commit db94845
Showing 22 changed files with 478 additions and 121 deletions.
3 changes: 2 additions & 1 deletion .env
@@ -1,8 +1,9 @@
MODE=PRODUCTION
LOG_LEVEL=INFO
PORT=9099
LOGIN_TYPE=PASSWORD
SPARK_HOST=hdcpx02.interno
MONGO=mongodb://notebook:burning-book@localhost:27017
USER_BLACKLIST=sods3001,phdpdig
SPARK_QUEUE=root.digital.users
SPARK_LIBRARIES=/data2/digital_stage/app/production/scala/Load_digital/lib/commons-csv-1.2.jar,/data2/digital_stage/app/production/scala/Load_digital/lib/spark-csv_2.10-1.5.0.jar
SPARK_LIBRARIES=/data2/digital_stage/app/production/scala/Load_digital/lib/commons-csv-1.2.jar,/data2/digital_stage/app/production/scala/Load_digital/lib/spark-csv_2.10-1.5.0.jar
24 changes: 24 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,27 @@
v0.1.3-alpha
===============================================================================

### Bug Fixes
- Corrects the progress bar calculation (it was previously adding the numbers in parentheses)
- Progress bars are always marked as done when the chunk finishes running
- Fixes the "run all above" bug where, once activated, it always ran to the end
- Fixes the progress bar bug when there was no "=" or ">" character
- Fixes the receipt not following the chunk when it is moved up or down

### Quality of Life
- Execution tags on the chunks
- Adds formatted tables to the result card
- Naming chunks is now possible
- Spark configuration is now saved in each book
- Downloading the code is now possible
- When running all above, the screen jumps to the receipt of the running chunk
- Run all above now marks all the chunks above as running
- Login using an SSH identity file

### Internal
- Protects the user password in the session store

v0.1.2-alpha
===============================================================================

7 changes: 7 additions & 0 deletions README.md
@@ -48,6 +48,12 @@ services:
MONGO_INITDB_ROOT_PASSWORD: book
```
## Connecting with identity files
If you need an identity file to log in to your Spark host (AWS users, for example), set the
environment variable `LOGIN_TYPE` to `SSH`. The login screen will then ask for your identity
file during login.

## Environment Variables

| Variable | Default | Meaning |
@@ -58,3 +64,4 @@ services:
| USER_BLACKLIST | | Users to be denied access |
| SPARK_QUEUE | | Default queue of all new spark-shell sessions |
| SPARK_LIBRARIES | | Libraries in SPARK_HOST to be used |
| LOGIN_TYPE | PASSWORD | Type of login, either PASSWORD or SSH |
8 changes: 7 additions & 1 deletion api/book/book-model.js
@@ -4,10 +4,16 @@ const newName = 'Novo Book'
const schema = new mongoose.Schema({
name: { type: String, required: true },
commands: [{
name: String,
command: String
}],
owner: { type: String, required: true },
sharedWith: [String]
sharedWith: [String],
sparkConfig: {
executors: Number,
cores: Number,
memory: Number
}
})

schema.index({
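For reference, a minimal sketch of the document shape the updated schema describes; the field names come from the schema above, while every value (including the logins) is purely illustrative.

```js
// Illustrative only: field names from the schema above, values invented.
const exampleBook = {
    name: 'Novo Book',
    owner: 'someuser',                 // hypothetical login
    sharedWith: ['otheruser'],         // hypothetical login
    commands: [
        { name: 'Row count', command: 'spark.read.textFile("/tmp/data.txt").count()' }
    ],
    sparkConfig: { executors: 2, cores: 4, memory: 8 }   // new in this commit
}
```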
20 changes: 20 additions & 0 deletions api/crypt/password-utils.js
@@ -0,0 +1,20 @@
const crypto = require('crypto')

// Password encryption
const key = crypto.randomBytes(32)
const iv = crypto.randomBytes(16)

function crush(password) {
const cypher = crypto.createCipheriv('aes-256-ctr', key, iv)
return Buffer.concat([cypher.update(password), cypher.final()]).toString('hex')
}

function uncrush(password) {
const decript = crypto.createDecipheriv('aes-256-ctr', key, iv)
return Buffer.concat([decript.update(Buffer.from(password, 'hex')), decript.final()]).toString()
}

module.exports = {
crush,
uncrush
}
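A minimal usage sketch of the new helper, assuming it is called once at login time to protect the password before it goes into the session store (only the `uncrush` side appears in this commit, in `api/socket/spark-socket.js` further down). Because `key` and `iv` are generated at process start, a value only round-trips within the same process.

```js
// Sketch only - the require path and the login-time call are assumptions.
const passwordUtils = require('./api/crypt/password-utils')

// Presumed login flow: only the encrypted form is kept in the session store.
const protectedPassword = passwordUtils.crush('my-secret-password')
console.log(protectedPassword)                        // hex ciphertext

// spark-socket.js later recovers the plain text to open the SSH session.
console.log(passwordUtils.uncrush(protectedPassword)) // 'my-secret-password'
```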
34 changes: 32 additions & 2 deletions api/socket/book-socket.js
@@ -1,5 +1,7 @@
const Book = require('../book/book-model').Book
const books = {} // Keeps the books from drifting out of sync across sockets
// Keeps the books from drifting out of sync across sockets
// TODO this prevents the application from scaling horizontally. Revisiting it may be necessary
const books = {}

module.exports = socket => {
const usuario = socket.handshake.session.usuario
@@ -26,7 +28,7 @@ module.exports = socket => {
console.warn(`[IO BOOK - ${usuario.login}] Tentativa de acesso a um book que não existe ${bookId}`)
socket.emit('exit')
socket.disconnect()
returns
return
}

if (!temAcesso(book)) {
@@ -78,6 +80,34 @@ module.exports = socket => {
socket.broadcast.to(bookId).emit('update', index, command)
})

socket.on('chunk.name', (index, name) => {
book.commands[index].name = name
book.save()
socket.broadcast.to(bookId).emit('chunk.name', index, name)
})

socket.on('chunk.move', (source, destination) => {
const comSource = book.commands[source]
const comDestination = book.commands[destination]
book.commands[source] = comDestination
book.commands[destination] = comSource
book.markModified('commands')
book.save()
console.log(comDestination, book.commands[source])
console.log(`move ${source} > ${destination}`)
socket.broadcast.to(bookId).emit('chunk.move', source, destination)
})

socket.on('spark.config', (executors, cores, memory) => {
book.sparkConfig = {
executors,
cores,
memory
}
book.save()
socket.broadcast.to(bookId).emit('spark.config', executors, cores, memory)
})

socket.on('disconnect', () => {
console.info(`[IO BOOK - ${usuario.login}] Desconectou`)
books[bookId].count--
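A hedged client-side sketch of the new events: the event names and argument order are taken from the handlers above, but the browser code itself (including how the socket is opened) is not part of this commit and is only assumed here.

```js
// Hypothetical client-side usage - only the event signatures come from the handlers above.
// `io` is the global exposed by the socket.io client script in the browser.
const bookId = 'some-book-id'                   // hypothetical id
const socket = io({ query: { bookId } })        // handshake details assumed

socket.emit('chunk.name', 0, 'Load data')       // name the first chunk
socket.emit('chunk.move', 2, 1)                 // swap the chunks at positions 2 and 1
socket.emit('spark.config', 2, 4, 8)            // executors, cores, memory
```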
3 changes: 3 additions & 0 deletions api/socket/spark-socket.js
@@ -1,8 +1,11 @@
const SparkShell = require('../spark-shell/spark-shell').SparkSession
const Stream = require('stream').PassThrough
const passwordUtils = require('../crypt/password-utils')

module.exports = socket => {
const usuario = socket.handshake.session.usuario
usuario.senha = passwordUtils.uncrush(usuario.senha)

const config = socket.handshake.query
console.debug(`[IO SPARK - ${usuario.login}] Conectou`)

10 changes: 6 additions & 4 deletions api/spark-shell/spark-shell-stub.js
@@ -3,7 +3,7 @@ const { PassThrough } = require('stream')

// Mock configuration, kept in one place for easy tweaking
const config = {
shellOpenTime: 5 * 1000,
shellOpenTime: 2 * 1000,
mockRunCommand: (user, comando, stream) => {
console.log(`[SPARK MOCK - ${user}] Run recieved\n${comando}`)

@@ -18,8 +18,9 @@ const config = {

// How many counters to send and at what interval
const stages = [
{ step: 10, progresso: 0 },
{ step: 15, progresso: 0 }
{ step: 50, progresso: 0 },
{ step: 30, progresso: 0 },
{ step: 40, progresso: 0, incompleto: true }
]

// Sends the progress counters every second
@@ -28,7 +29,8 @@ const config = {
const timer = setInterval(() => {
stage.progresso += stage.step
stream.emit('data', progress(stage.progresso, index))
if (stage.progresso >= 100) {
const parar = (stage.incompleto && stage.progresso >= 80) || (stage.progresso >= 100)
if (parar) {
finalizado++
clearInterval(timer)
if (finalizado == stages.length) {
15 changes: 12 additions & 3 deletions api/spark-shell/spark-shell.js
@@ -1,5 +1,7 @@
const Ssh = require('ssh2-promise')

const LOGIN_TYPE = process.env.LOGIN_TYPE ? process.env.LOGIN_TYPE : 'PASSWORD'

// Class responsible for connecting and creating a new Spark session
class SparkSession {

@@ -14,12 +16,19 @@ class SparkSession {
this.__user = user
console.debug(`[SPARK - ${this.__user}] Command:\n\t${this.__startCommand}`)

this.ssh = new Ssh({
const parameters = {
host: process.env.SPARK_HOST,
username: user.toLowerCase(),
password: password,
keepaliveInterval: 60 * 1000
})
}

if (LOGIN_TYPE === 'PASSWORD') {
parameters.password = password
} else if (LOGIN_TYPE === 'SSH') {
parameters.privateKey = password
}

this.ssh = new Ssh(parameters)
}

connect() {
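A hedged sketch of what the branch above means for callers, assuming the constructor signature suggested by the surrounding diff (a user plus a single credential argument): under `LOGIN_TYPE=PASSWORD` the credential is forwarded to ssh2-promise as `password`, while under `LOGIN_TYPE=SSH` the same value is forwarded as `privateKey`, so it is expected to carry the identity file contents rather than a path.

```js
// Sketch only - the constructor signature and require path are assumed, not confirmed by this diff.
const fs = require('fs')
const { SparkSession } = require('./api/spark-shell/spark-shell')

// LOGIN_TYPE=PASSWORD: the credential becomes parameters.password.
const byPassword = new SparkSession('someuser', 'their-password')

// LOGIN_TYPE=SSH: the very same argument becomes parameters.privateKey,
// so the caller passes the key contents read from the uploaded identity file.
const byKey = new SparkSession('someuser', fs.readFileSync('/home/someuser/.ssh/id_rsa'))
```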
1 change: 1 addition & 0 deletions dockerfile
@@ -4,6 +4,7 @@ WORKDIR /app
RUN rm /app/.env
RUN npm i --production
ENV LOG_LEVEL=INFO
ENV LOGIN_TYPE=PASSWORD
ENV SPARK_HOST=localhost
ENV MONGO=mongodb://localhost:27017
ENV USER_BLACKLIST=
24 changes: 24 additions & 0 deletions package-lock.json

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions package.json
@@ -31,6 +31,7 @@
"d3": "^5.15.0",
"dotenv": "^8.2.0",
"express": "^4.17.1",
"express-fileupload": "^1.1.7-alpha.1",
"express-session": "^1.17.0",
"express-socket.io-session": "^1.3.5",
"jquery": "^3.4.1",
4 changes: 4 additions & 0 deletions web/css/book-list.css
@@ -2,6 +2,10 @@ body {
background-color: darkslategray
}

a {
cursor: pointer;
}

.navbar {
background-color: rgb(88, 131, 131);
color: white;
34 changes: 30 additions & 4 deletions web/css/book.css
@@ -4,6 +4,10 @@ body {
overflow-x: hidden;
}

a {
cursor: pointer;
}

.pointer {
cursor: pointer;
}
@@ -80,10 +84,6 @@ body {
border-bottom: 1px solid rgb(216, 216, 216);
}

.command-block.running .card-title {
border-bottom: 1px solid rgb(170, 170, 170);
}

.command-block .card-title .dropdown-menu {
font-size: 100%;
}
@@ -110,6 +110,16 @@
background-color: silver;
}

.command-block.running .card-title {
border-bottom: 1px solid rgb(170, 170, 170);
}

.command-block.running .bloco::after {
content: 'Running...';
color: blue;
margin-left: 5px;
}

.command-block.running .button-remove {
display: none;
}
@@ -122,6 +132,17 @@
display: none;
}

.command-block.done .bloco::after {
content: 'Done';
color: #0e8913;
margin-left: 5px;
}

.command-block.done .recibo .progress-bar {
background-color: #28a745;
width: 100%!important;
}

.recibo {
font-family: 'Courier New', Courier, monospace;
border: 1px solid gray;
@@ -138,6 +159,11 @@
color: white
}

.recibo .table td,
.recibo .table th {
border: none
}

button.new-command {
width: 100%;
background-color: rgba(0, 0, 0, 0.13);
8 changes: 2 additions & 6 deletions web/pages/book-list.html
@@ -63,16 +63,12 @@
<i class="fa fa-ellipsis-v"></i>
</span>
<div class="dropdown-menu">
<a class="dropdown-item" href="#" onclick="compartilharScreen('XXX')"data-toggle="modal" data-target="#compartilhar">
<a class="dropdown-item" onclick="compartilharScreen('XXX')"data-toggle="modal" data-target="#compartilhar">
<i class="fa fa-share-alt"></i>
Compartilhar
</a>
<a class="dropdown-item rodar-ate disabled" href="#" onclick="runAllTo(0)">
<i class="fa fa-play"></i>
Rodar todos acima
</a>
<div class="dropdown-divider"></div>
<a class="dropdown-item" href="#" onclick="remover('XXXX')">
<a class="dropdown-item" onclick="remover('XXXX')">
<i class="fa fa-trash"></i>
Remover
</a>