Commit

Monitor now directly creates current.ongoing instead of waiting for logs to be appended

xavierguihot committed Mar 5, 2018
1 parent 5c6b084 commit 0f3a5ff
Showing 3 changed files with 16 additions and 7 deletions.
10 changes: 5 additions & 5 deletions README.md
@@ -5,7 +5,7 @@
## Overview


-Version: 1.1.0
+Version: 1.1.1

API Scaladoc: [SparkHelper](http://xavierguihot.com/spark_helper/#com.spark_helper.SparkHelper$)

@@ -126,7 +126,7 @@ assert(DateHelper.nDaysAfterDate(3, "20170307") == "20170310")
### Monitor:

The full list of methods is available at
-[scaladoc](http://xavierguihot.com/spark_helper/#com.spark_helper.Monitor$)
+[Monitor](http://xavierguihot.com/spark_helper/#com.spark_helper.Monitor$)

It's a simple logger/report which contains a report that one can update from
the driver and a success state. The idea is to persist job executions logs and
@@ -253,7 +253,7 @@ With sbt, add these lines to your build.sbt:
```scala
resolvers += "jitpack" at "https://jitpack.io"

-libraryDependencies += "com.github.xavierguihot" % "spark_helper" % "v1.1.0"
+libraryDependencies += "com.github.xavierguihot" % "spark_helper" % "v1.1.1"
```

With maven, add these lines to your pom.xml:
@@ -269,7 +269,7 @@ With maven, add these lines to your pom.xml:
<dependency>
<groupId>com.github.xavierguihot</groupId>
<artifactId>spark_helper</artifactId>
-<version>v1.1.0</version>
+<version>v1.1.1</version>
</dependency>
```

@@ -283,6 +283,6 @@ allprojects {
}
dependencies {
-compile 'com.github.xavierguihot:spark_helper:v1.1.0'
+compile 'com.github.xavierguihot:spark_helper:v1.1.1'
}
```
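For context, the Monitor described in the README excerpt above is driven entirely from the Spark driver. A minimal usage sketch, limited to the four setters that appear in the Monitor.scala diff further down (the title, description, contact and folder values are made-up placeholders):

```scala
import com.spark_helper.Monitor

// Describe the job once, from the driver:
Monitor.setTitle("My Spark job")
Monitor.addDescription("Aggregates the logs of the day per user")
Monitor.addContacts(List("someone@example.com"))
Monitor.setLogFolder("path/to/job/logs")

// With this commit, each of these calls already (re)writes the
// "current.ongoing" report under the log folder, instead of waiting for
// the first log line to be appended to the report.
```

Updating the report during the job and storing the final success/failure state go through other Monitor methods that are not part of this diff.
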
2 changes: 1 addition & 1 deletion build.sbt
@@ -1,6 +1,6 @@
name := "spark_helper"

-version := "1.1.0"
+version := "1.1.1"

scalaVersion := "2.11.12"

11 changes: 10 additions & 1 deletion src/main/scala/com/spark_helper/Monitor.scala
@@ -185,6 +185,7 @@ object Monitor {
def setTitle(title: String): Unit = {
reportTitle = Some(title)
reportHeader = buildReportHeader()
+storeCurrent()
}

/** Sets the report's contact list.
@@ -206,6 +207,7 @@
def addContacts(contacts: List[String]): Unit = {
pointsOfContact = Some(contacts)
reportHeader = buildReportHeader()
+storeCurrent()
}

/** Sets the report's description.
@@ -227,6 +229,7 @@
def addDescription(description: String): Unit = {
reportDescription = Some(description)
reportHeader = buildReportHeader()
+storeCurrent()
}

/** Sets the folder in which logs are stored.
@@ -250,6 +253,7 @@
def setLogFolder(logFolder: String): Unit = {
logDirectory = Some(logFolder)
prepareLogFolder()
+storeCurrent()
}

/** Activates the purge of logs and sets the purge window.
@@ -573,6 +577,12 @@ object Monitor {

// And if the logFolder parameter has been set, we also update live the log
// file:
+storeCurrent()
+}
+
+/** Updates the current stored version of logs in file
+* logFolder/current.ongoing */
+private def storeCurrent(): Unit =
logDirectory.foreach {
case logFolder => {

@@ -588,7 +598,6 @@
HdfsHelper.writeToHdfsFile(ongoingReport, s"$logFolder/current.ongoing")
}
}
-}

private def purgeOutdatedLogs(logFolder: String, window: Int): Unit = {

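Taken together, the Monitor.scala changes boil down to this: every setter now also calls the new private storeCurrent(), which immediately (re)writes logFolder/current.ongoing rather than waiting for log lines to be appended. Below is a self-contained sketch of that flow; the state, the report building and the file write are simplified stand-ins (the real code keeps more state, builds the header with buildReportHeader() and writes through HdfsHelper.writeToHdfsFile):

```scala
object MonitorFlowSketch {

  // Simplified stand-ins for Monitor's internal state:
  private var reportTitle: Option[String] = None
  private var logDirectory: Option[String] = None

  def setTitle(title: String): Unit = {
    reportTitle = Some(title)
    storeCurrent() // new in this commit: persist current.ongoing right away
  }

  def setLogFolder(logFolder: String): Unit = {
    logDirectory = Some(logFolder)
    storeCurrent()
  }

  // Before this commit, current.ongoing was only written when log lines were
  // appended to the report; now the setters refresh it as well.
  private def storeCurrent(): Unit =
    logDirectory.foreach { logFolder =>
      val ongoingReport = reportTitle.getOrElse("") // real code: full header + logs so far
      // The real code writes via HdfsHelper.writeToHdfsFile(ongoingReport,
      // s"$logFolder/current.ongoing"); a local write keeps this sketch standalone:
      val target = java.nio.file.Paths.get(logFolder, "current.ongoing")
      java.nio.file.Files.createDirectories(target.getParent)
      java.nio.file.Files.write(target, ongoingReport.getBytes("UTF-8"))
    }
}
```

Calling MonitorFlowSketch.setLogFolder("/tmp/monitor_sketch") and then setTitle("My job") leaves an up-to-date current.ongoing after each call, which is the behaviour this commit introduces for the real Monitor.
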

0 comments on commit 0f3a5ff
