Auto deploy feed version #361

Merged · 76 commits · Jun 15, 2021

Commits
526c126
feat(auto-deploy): auto-deploy to OTP when a new feed version is proc…
landonreed Dec 18, 2020
0f45633
Merge branch 'dev' into auto-deploy
Jan 12, 2021
21919cb
refactor: Initial auto deploy feed version work
Jan 18, 2021
f7bafa6
refactor(FeedVersionTest): Removed unused imports (hangover from test…
Jan 18, 2021
82b7a6d
refactor(Added an additional test): Added a unit test to cover high s…
Jan 19, 2021
ac16806
test: add mock GTFS that expires in 2099
landonreed Jan 19, 2021
3f3e9f9
refactor(Addressed PR feedback): Updated DB call and unit tests
Jan 20, 2021
c37cae0
refactor(AutoDeployFeedJob): Added fail status if job unable to be de…
Jan 20, 2021
e734c03
Merge branch 'dev' into auto-deploy
Jan 21, 2021
6b12c36
refactor(FeedVersion.java): use select distinct for checking error types
landonreed Jan 21, 2021
ce092d6
refactor: provide access to MonitorableJob#subjobs for testing
landonreed Jan 21, 2021
bddfa11
refactor(AutoDeployFeedJobTest): Updated assertions with expected text
Jan 21, 2021
da212c7
refactor(auto-deploy): entirely skip auto deploy job on certain condi…
landonreed Jan 21, 2021
b8d6d91
refactor(Addressed PR feedback): Addressed PR comments
Jan 25, 2021
67fd142
Merge branch 'auto-deploy' of https://github.com/ibi-group/datatools-…
Jan 25, 2021
853862f
refactor(ProcessSingleFeedJob): Added the missing DataManager import
Jan 25, 2021
93400d4
refactor(AutoDeployFeedJobTest): Removed '*' use in imports
Jan 27, 2021
30dde34
refactor(AutoDeployFeedJob): Updated method name
Jan 27, 2021
67cdad4
refactor(AutoDeployFeedJob): Removed unused import
Jan 27, 2021
7e59914
Merge branch 'dev' into auto-deploy
Jan 28, 2021
f3b90b6
refactor(AutoDeployJob): check for active feed fetches before deploying
landonreed Feb 11, 2021
702a5eb
refactor(Auto deploy race conditions unit test): Additional unit test…
Feb 18, 2021
8e27a03
refactor(Addressed PR feedback): Updated how the MonitorableJob messa…
Feb 19, 2021
f172917
refactor(AutoDeployFeedJobTest): Updated base gtfs zip file used to t…
Feb 22, 2021
629ffd2
refactor(AutoDeployFeedJob): Created separate feed sources for each u…
Feb 23, 2021
30e6dcf
refactor(AutoDeployFeedJobTest): Created separate server, project and…
Feb 23, 2021
51f7eda
refactor(AutoDeployFeedJobTest): Fixed incorrect id assignment
Feb 23, 2021
70014be
refactor(AutoDeployFeedJobTest): Reduced the number of feed sources u…
Feb 23, 2021
84586e2
refactor(MonitorableJob): avoid duplicate completion in completeSucce…
landonreed Feb 23, 2021
1d21bf4
refactor(Updated approach to auto deploy): Auto deploy logic now hand…
Mar 17, 2021
135a592
refactor(Fixed merge conflicts): Fixed merge conflicts in DeployJob
Mar 17, 2021
8ba6fc5
Merge branch 'dev' into auto-deploy
Mar 22, 2021
3f9d8b5
refactor(AutoDeployFeedJobTest): Updated junit annotations
Mar 22, 2021
00e451c
Update src/main/java/com/conveyal/datatools/manager/jobs/AutoDeployFe…
Mar 23, 2021
2cb11ea
refactor(Addressed PR feedback): Updated the auto deploy job logic an…
Mar 25, 2021
238d4c6
Update src/test/resources/com/conveyal/datatools/gtfs/fake-agency-wit…
Mar 29, 2021
de1ce5c
Update src/test/java/com/conveyal/datatools/manager/jobs/AutoDeployFe…
Mar 29, 2021
6987ec9
refactor(Addressed PR feedback): Update to focus on deployment feed v…
Mar 29, 2021
801c898
refactor(AutoDeployJob): Updated to use feed sources associated with …
Mar 30, 2021
e5d3a4d
refactor(FeedVersion): Corrected comments
Mar 30, 2021
1e38a4c
refactor: change some logic in AutoDeployJob
evansiroky Mar 31, 2021
9efd934
Merge pull request #372 from ibi-group/auto-deploy-eas
Mar 31, 2021
1a282cf
Update src/main/java/com/conveyal/datatools/manager/jobs/AutoDeployJo…
Mar 31, 2021
1175cdb
Merge branch 'dev' into auto-deploy
Apr 7, 2021
24f1725
refactor(Addressed PR feedback): Addressed PR feedback
Apr 8, 2021
3c3f66b
refactor: add abstract FeedSourceJob and FeedVersionJob classes
landonreed Apr 9, 2021
afaf3bc
refactor(FeedSource.java): Streamlined job instanceof check
Apr 9, 2021
243ef63
Update src/main/java/com/conveyal/datatools/manager/jobs/ProcessSingl…
Apr 9, 2021
413a2b4
Update src/main/java/com/conveyal/datatools/common/status/FeedSourceJ…
Apr 12, 2021
fa3d584
Update src/main/java/com/conveyal/datatools/common/status/FeedVersion…
Apr 12, 2021
6fd03b5
refactor(FeedSource.java): Reverted job in progress check back to a m…
Apr 13, 2021
6cc2657
Merge branch 'auto-deploy' of https://github.com/ibi-group/datatools-…
Apr 13, 2021
38cb1c2
refactor(FeedSource.java): Removed obsolete comments/code
Apr 13, 2021
baad68a
refactor(Added pinned feed versions to Deployment): Added pinned feed…
Apr 15, 2021
cafb0e5
Update src/main/java/com/conveyal/datatools/manager/models/Deployment…
Apr 19, 2021
2aea8ae
refactor(AutoDeployJob.java): Removed notification regarding advancin…
Apr 21, 2021
e65a337
refactor(Addressed PR feedback): Addressed PR feedback
Apr 28, 2021
83d1134
refactor: consolidate AutoDeployJob initialization in ProcessSingleFe…
evansiroky Apr 28, 2021
2d3b82b
Merge pull request #375 from ibi-group/auto-deploy-eas
Apr 30, 2021
0fb54ca
test: fix GisExportJobTest
evansiroky May 2, 2021
d209aca
refactor(auto-deploy): add code to end auto-deployment in certain cases
evansiroky May 3, 2021
5818355
test: fix a failing test
evansiroky May 3, 2021
1c429e8
refactor: move post-deployment auto-deploy logic into method
evansiroky May 4, 2021
3083c5f
Merge pull request #377 from ibi-group/auto-deploy-eas
May 4, 2021
db462d0
refactor(auto-deploy): refactor a few things to improve auto-deployment
evansiroky May 7, 2021
c4d04c7
refactor: address PR review comments
evansiroky May 11, 2021
03e9d2e
Merge pull request #381 from ibi-group/auto-deploy-eas
evansiroky May 11, 2021
0f3f36f
refactor: make sure deployment is persisted in AutoDeployJob
evansiroky May 11, 2021
276aa46
Merge branch 'dev' into auto-deploy
May 21, 2021
2292e72
refactor: add FeedSource#versions
landonreed May 26, 2021
68e678c
refactor: Apply suggestions from code review
landonreed May 27, 2021
0a82db8
refactor(FeedSource): change versions to versionCount
landonreed May 28, 2021
51cca5c
Merge branch 'dev' into auto-deploy
Jun 1, 2021
e88392c
Merge branch 'dev' into auto-deploy
Jun 1, 2021
d196cc4
refactor(Fixed merge conflicts): Fixed merge conflicts
Jun 15, 2021
38c1b43
refactor(HandleCorruptGTFSFileTest.java): Updated test to include pro…
Jun 15, 2021
2 changes: 1 addition & 1 deletion pom.xml
@@ -270,7 +270,7 @@
<dependency>
<groupId>com.github.conveyal</groupId>
<artifactId>gtfs-lib</artifactId>
<version>6.2.2</version>
<version>6.2.4</version>
<!-- Exclusions added in order to silence SLF4J warnings about multiple bindings:
http://www.slf4j.org/codes.html#multiple_bindings
-->
14 changes: 14 additions & 0 deletions src/main/java/com/conveyal/datatools/common/status/FeedSourceJob.java
@@ -0,0 +1,14 @@
package com.conveyal.datatools.common.status;

import com.conveyal.datatools.manager.auth.Auth0UserProfile;

/**
* This class should be used for any job that operates on a FeedSource.
*/
public abstract class FeedSourceJob extends MonitorableJob {
public FeedSourceJob(Auth0UserProfile owner, String name, JobType type) {
super(owner, name, type);
}

public abstract String getFeedSourceId();
}
14 changes: 14 additions & 0 deletions src/main/java/com/conveyal/datatools/common/status/FeedVersionJob.java
@@ -0,0 +1,14 @@
package com.conveyal.datatools.common.status;

import com.conveyal.datatools.manager.auth.Auth0UserProfile;

/**
* This class should be used for any job that operates on a FeedVersion.
*/
public abstract class FeedVersionJob extends FeedSourceJob {
public FeedVersionJob(Auth0UserProfile owner, String name, JobType type) {
super(owner, name, type);
}

public abstract String getFeedVersionId();
}
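
The two abstract classes above only add identifier accessors on top of MonitorableJob: FeedSourceJob ties a job to a feed source, and FeedVersionJob narrows that to a single feed version. As a rough illustration only (this class is not part of the PR; its name, fields, and job type are hypothetical), a concrete job built on them might look like the following sketch.

package com.conveyal.datatools.manager.jobs;

import com.conveyal.datatools.common.status.FeedVersionJob;
import com.conveyal.datatools.manager.auth.Auth0UserProfile;

// Hypothetical example, not part of this diff: a job that operates on one feed version.
public class ExampleFeedVersionJob extends FeedVersionJob {
    private final String feedVersionId;
    private final String feedSourceId;

    public ExampleFeedVersionJob(Auth0UserProfile owner, String feedVersionId, String feedSourceId) {
        // JobType.UNKNOWN_TYPE is used only as a placeholder job type here.
        super(owner, "Example feed version job", JobType.UNKNOWN_TYPE);
        this.feedVersionId = feedVersionId;
        this.feedSourceId = feedSourceId;
    }

    @Override
    public void jobLogic() {
        // Real work on the feed version would go here (e.g., validation or deployment checks).
        // completeSuccessfully is assumed from the Status API referenced elsewhere in this PR.
        status.completeSuccessfully("Example job finished.");
    }

    @Override
    public String getFeedVersionId() {
        return feedVersionId;
    }

    @Override
    public String getFeedSourceId() {
        return feedSourceId;
    }
}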
src/main/java/com/conveyal/datatools/common/status/MonitorableJob.java
@@ -1,7 +1,7 @@
package com.conveyal.datatools.common.status;

import com.conveyal.datatools.manager.DataManager;
import com.conveyal.datatools.manager.auth.Auth0UserProfile;
import com.conveyal.datatools.manager.utils.JobUtils;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
import org.apache.commons.lang3.exception.ExceptionUtils;
@@ -18,8 +18,6 @@
import java.util.Set;
import java.util.UUID;

import static com.conveyal.datatools.manager.controllers.api.StatusController.getJobsForUser;

/**
* Created by landon on 6/13/16.
*/
@@ -31,8 +29,15 @@ public abstract class MonitorableJob implements Runnable, Serializable {
// Public fields will be serialized over HTTP API and visible to the web client
public final JobType type;
public File file;
public String parentJobId;
public JobType parentJobType;

/**
* Whether the job is currently running. This is needed because some jobs are recurring: they do not run until their
* scheduled time and may run again after they finish.
*/
public boolean active = false;

protected String parentJobId;
protected JobType parentJobType;
// Status is not final to allow some jobs to have extra status fields.
public Status status = new Status();
// Name is not final in case it needs to be amended during job processing.
@@ -48,6 +53,7 @@ public abstract class MonitorableJob implements Runnable, Serializable {
public List<MonitorableJob> subJobs = new ArrayList<>();

public enum JobType {
AUTO_DEPLOY_FEED_VERSION,
UNKNOWN_TYPE,
ARBITRARY_FEED_TRANSFORM,
BUILD_TRANSPORT_NETWORK,
@@ -102,9 +108,9 @@ public MonitorableJob () {
private void registerJob() {
// Get all active jobs and add the latest active job. Note: Removal of job from user's set of jobs is handled
// in the StatusController when a user requests their active jobs and the job has finished/errored.
Set<MonitorableJob> userJobs = getJobsForUser(this.owner);
Set<MonitorableJob> userJobs = JobUtils.getJobsForUser(this.owner);
userJobs.add(this);
DataManager.userJobsMap.put(retrieveUserId(), userJobs);
JobUtils.userJobsMap.put(retrieveUserId(), userJobs);
}

@JsonProperty("owner")
@@ -117,6 +123,11 @@ public String retrieveEmail() {
return this.owner.getEmail();
}

@JsonIgnore @BsonIgnore
public List<MonitorableJob> getSubJobs() {
return subJobs;
}

public File retrieveFile () {
return file;
}
@@ -140,6 +151,7 @@ public void jobFinished () {
* override jobLogic and jobFinished method(s).
*/
public void run () {
active = true;
boolean parentJobErrored = false;
boolean subTaskErrored = false;
String cancelMessage = "";
@@ -195,8 +207,10 @@ public void run () {
// could be displayed by the client.
} catch (Exception e) {
status.fail("Job failed due to unhandled exception!", e);
} finally {
LOG.info("{} (jobId={}) {} in {} ms", type, jobId, status.error ? "errored" : "completed", status.duration);
active = false;
}
LOG.info("{} (jobId={}) {} in {} ms", type, jobId, status.error ? "errored" : "completed", status.duration);
}

/**
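
A note on the active flag introduced above: it marks whether run() is currently executing, which is distinct from a job merely being registered in a user's job set, since recurring jobs stay registered between scheduled runs. A hypothetical caller-side use (not part of this PR) could look like this.

import com.conveyal.datatools.common.status.MonitorableJob;

import java.util.concurrent.Executor;

// Hypothetical helper, not part of this diff: skip re-submitting a recurring job
// while a previous run is still executing.
public class JobResubmitHelper {
    public static void resubmitIfIdle(MonitorableJob job, Executor executor) {
        if (!job.active) {
            executor.execute(job);
        }
    }
}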
@@ -16,6 +16,7 @@
import com.conveyal.datatools.manager.models.JsonViews;
import com.conveyal.datatools.manager.models.Snapshot;
import com.conveyal.datatools.manager.persistence.Persistence;
import com.conveyal.datatools.manager.utils.JobUtils;
import com.conveyal.datatools.manager.utils.json.JsonManager;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -101,7 +102,7 @@ private static String createSnapshot (Request req, Response res) throws IOExcept
createSnapshotJob.addNextJob(new CreateFeedVersionFromSnapshotJob(feedSource, snapshot, userProfile));
}
// Begin asynchronous execution.
DataManager.heavyExecutor.execute(createSnapshotJob);
JobUtils.heavyExecutor.execute(createSnapshotJob);
return SparkUtils.formatJobMessage(createSnapshotJob.jobId, "Creating snapshot.");
}

@@ -122,7 +123,7 @@ private static String importFeedVersionAsSnapshot(Request req, Response res) {
boolean preserveBuffer = "true".equals(req.queryParams("preserveBuffer")) && feedSource.editorNamespace != null;
CreateSnapshotJob createSnapshotJob =
new CreateSnapshotJob(userProfile, snapshot, true, false, preserveBuffer);
DataManager.heavyExecutor.execute(createSnapshotJob);
JobUtils.heavyExecutor.execute(createSnapshotJob);
return formatJobMessage(createSnapshotJob.jobId, "Importing version as snapshot.");
}

@@ -161,7 +162,7 @@ private static String restoreSnapshot (Request req, Response res) {
String name = "Restore snapshot " + snapshotToRestore.name;
Snapshot snapshot = new Snapshot(name, feedSource.id, snapshotToRestore.namespace);
CreateSnapshotJob createSnapshotJob = new CreateSnapshotJob(userProfile, snapshot, true, false, preserveBuffer);
DataManager.heavyExecutor.execute(createSnapshotJob);
JobUtils.heavyExecutor.execute(createSnapshotJob);
return formatJobMessage(createSnapshotJob.jobId, "Restoring snapshot...");
}

@@ -175,7 +176,7 @@ private static String downloadSnapshotAsGTFS(Request req, Response res) {
// Create and kick off export job.
// FIXME: what if a snapshot is already written to S3?
ExportSnapshotToGTFSJob exportSnapshotToGTFSJob = new ExportSnapshotToGTFSJob(userProfile, snapshot);
DataManager.heavyExecutor.execute(exportSnapshotToGTFSJob);
JobUtils.heavyExecutor.execute(exportSnapshotToGTFSJob);
return formatJobMessage(exportSnapshotToGTFSJob.jobId, "Exporting snapshot to GTFS.");
}

@@ -1,6 +1,19 @@
package com.conveyal.datatools.editor.datastore;

import com.conveyal.datatools.editor.models.transit.*;

import com.conveyal.datatools.editor.models.transit.Agency;
import com.conveyal.datatools.editor.models.transit.AttributeAvailabilityType;
import com.conveyal.datatools.editor.models.transit.EditorFeed;
import com.conveyal.datatools.editor.models.transit.Fare;
import com.conveyal.datatools.editor.models.transit.Route;
import com.conveyal.datatools.editor.models.transit.ScheduleException;
import com.conveyal.datatools.editor.models.transit.ServiceCalendar;
import com.conveyal.datatools.editor.models.transit.StatusType;
import com.conveyal.datatools.editor.models.transit.Stop;
import com.conveyal.datatools.editor.models.transit.StopTime;
import com.conveyal.datatools.editor.models.transit.Trip;
import com.conveyal.datatools.editor.models.transit.TripPattern;
import com.conveyal.datatools.editor.models.transit.TripPatternStop;
import com.conveyal.datatools.editor.utils.GeoUtils;
import com.conveyal.gtfs.GTFSFeed;
import com.conveyal.gtfs.model.CalendarDate;
16 changes: 8 additions & 8 deletions src/main/java/com/conveyal/datatools/manager/ConvertMain.java
@@ -6,9 +6,9 @@
import com.conveyal.datatools.editor.jobs.ConvertEditorMapDBToSQL;
import com.conveyal.datatools.editor.models.Snapshot;
import com.conveyal.datatools.manager.controllers.DumpController;
import com.conveyal.datatools.manager.controllers.api.StatusController;
import com.conveyal.datatools.manager.models.FeedSource;
import com.conveyal.datatools.manager.persistence.Persistence;
import com.conveyal.datatools.manager.utils.JobUtils;
import org.apache.commons.io.FileUtils;
import org.mapdb.Fun;
import org.slf4j.Logger;
@@ -97,15 +97,15 @@ public static void main(String[] args) throws Exception {
// STEP 3A: For each snapshot/editor DB, create a snapshot Mongo object for the feed source with the FeedLoadResult.
migrateEditorFeeds();
LOG.info("Done queueing!!!!!!!!");
int totalJobs = StatusController.getAllJobs().size();
while (!StatusController.filterActiveJobs(StatusController.getAllJobs()).isEmpty()) {
int totalJobs = JobUtils.getAllJobs().size();
while (!JobUtils.filterActiveJobs(JobUtils.getAllJobs()).isEmpty()) {
// While there are still active jobs, continue waiting.
Set<MonitorableJob> activeJobs = StatusController.filterActiveJobs(StatusController.getAllJobs());
Set<MonitorableJob> activeJobs = JobUtils.filterActiveJobs(JobUtils.getAllJobs());
LOG.info(String.format("%d/%d jobs still active. Checking for completion again in 5 seconds...", activeJobs.size(), totalJobs));
// LOG.info(String.join(", ", activeJobs.stream().map(job -> job.name).collect(Collectors.toList())));
int jobsInExecutor = ((ThreadPoolExecutor) DataManager.heavyExecutor).getActiveCount();
int jobsInExecutor = ((ThreadPoolExecutor) JobUtils.heavyExecutor).getActiveCount();
LOG.info(String.format("Jobs in thread pool executor: %d", jobsInExecutor));
LOG.info(String.format("Jobs completed by executor: %d", ((ThreadPoolExecutor) DataManager.heavyExecutor).getCompletedTaskCount()));
LOG.info(String.format("Jobs completed by executor: %d", ((ThreadPoolExecutor) JobUtils.heavyExecutor).getCompletedTaskCount()));
Thread.sleep(5000);
}
long durationInMillis = System.currentTimeMillis() - startTime;
@@ -141,11 +141,11 @@ public static boolean migrateEditorFeeds (String ...feedIdsToSkip) {
if (!feedSourcesEncountered.contains(feedSource.id)) {
// If this is the first feed encountered, load the editor buffer.
ConvertEditorMapDBToSQL convertEditorBufferToSQL = new ConvertEditorMapDBToSQL(snapshot.id.a, null);
DataManager.heavyExecutor.execute(convertEditorBufferToSQL);
JobUtils.heavyExecutor.execute(convertEditorBufferToSQL);
count++;
}
ConvertEditorMapDBToSQL convertEditorMapDBToSQL = new ConvertEditorMapDBToSQL(snapshot.id.a, snapshot.id.b);
DataManager.heavyExecutor.execute(convertEditorMapDBToSQL);
JobUtils.heavyExecutor.execute(convertEditorMapDBToSQL);
LOG.info(count + "/" + snapshotCount + " snapshot conversion queued");
feedSourcesEncountered.add(feedSource.id);
count++;
12 changes: 0 additions & 12 deletions src/main/java/com/conveyal/datatools/manager/DataManager.java
@@ -87,21 +87,9 @@ public class DataManager {
// TODO: define type for ExternalFeedResource Strings
public static final Map<String, ExternalFeedResource> feedResources = new HashMap<>();

/**
* Stores jobs underway by user ID. NOTE: any set created and stored here must be created with
* {@link Sets#newConcurrentHashSet()} or similar thread-safe Set.
*/
public static Map<String, Set<MonitorableJob>> userJobsMap = new ConcurrentHashMap<>();

// ObjectMapper that loads in YAML config files
private static final ObjectMapper yamlMapper = new ObjectMapper(new YAMLFactory());


// Heavy executor should contain long-lived CPU-intensive tasks (e.g., feed loading/validation)
public static Executor heavyExecutor = Executors.newFixedThreadPool(4);
// light executor is for tasks for things that should finish quickly (e.g., email notifications)
public static Executor lightExecutor = Executors.newSingleThreadExecutor();

public static String repoUrl;
public static String commit = "";

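
The executors and user-jobs map deleted from DataManager above reappear throughout this PR as members of a new JobUtils class (JobUtils.heavyExecutor, JobUtils.getJobsForUser, JobUtils.getAllJobs, JobUtils.filterActiveJobs, JobUtils.userJobsMap). The class itself is not shown in this section; the sketch below is only an inference from those call sites, and the method bodies are assumptions, not the merged implementation.

package com.conveyal.datatools.manager.utils;

import com.conveyal.datatools.common.status.MonitorableJob;
import com.conveyal.datatools.manager.auth.Auth0UserProfile;
import com.google.common.collect.Sets;

import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

// Inferred sketch only: shapes taken from the call sites in this PR; bodies are assumptions.
public class JobUtils {
    /** Jobs underway by user ID (moved here from DataManager); stored sets must be thread-safe. */
    public static Map<String, Set<MonitorableJob>> userJobsMap = new ConcurrentHashMap<>();

    /** Long-lived, CPU-intensive tasks such as feed loading/validation (moved from DataManager). */
    public static Executor heavyExecutor = Executors.newFixedThreadPool(4);

    /** Quick tasks such as email notifications (assumed to have moved here as well). */
    public static Executor lightExecutor = Executors.newSingleThreadExecutor();

    public static Set<MonitorableJob> getJobsForUser(Auth0UserProfile user) {
        // Assumed behavior: return (and lazily create) the thread-safe job set for this user.
        // getUser_id() is assumed here; the real code uses whatever user-ID accessor Auth0UserProfile provides.
        return userJobsMap.computeIfAbsent(user.getUser_id(), id -> Sets.newConcurrentHashSet());
    }

    public static Set<MonitorableJob> getAllJobs() {
        return userJobsMap.values().stream().flatMap(Set::stream).collect(Collectors.toSet());
    }

    public static Set<MonitorableJob> filterActiveJobs(Set<MonitorableJob> jobs) {
        // Assumed definition of "active": the job has not yet completed.
        return jobs.stream().filter(job -> !job.status.completed).collect(Collectors.toSet());
    }
}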
@@ -1,7 +1,6 @@
package com.conveyal.datatools.manager.controllers;

import com.conveyal.datatools.common.status.MonitorableJob;
import com.conveyal.datatools.manager.DataManager;
import com.conveyal.datatools.manager.auth.Auth0UserProfile;
import com.conveyal.datatools.manager.jobs.ProcessSingleFeedJob;
import com.conveyal.datatools.manager.jobs.ValidateFeedJob;
@@ -15,6 +14,7 @@
import com.conveyal.datatools.manager.models.Project;
import com.conveyal.datatools.manager.models.Snapshot;
import com.conveyal.datatools.manager.persistence.Persistence;
import com.conveyal.datatools.manager.utils.JobUtils;
import com.conveyal.datatools.manager.utils.json.JsonManager;
import com.conveyal.gtfs.validator.ValidationResult;
import com.fasterxml.jackson.core.JsonProcessingException;
@@ -365,7 +365,7 @@ public static boolean validateAll (boolean load, boolean force, String filterFee
} else {
job = new ValidateFeedJob(version, systemUser, false);
}
DataManager.heavyExecutor.execute(job);
JobUtils.heavyExecutor.execute(job);
}
// ValidateAllFeedsJob validateAllFeedsJob = new ValidateAllFeedsJob("system", force, load);
return true;