chore: prepare 2.41.0 #1694

Merged
34 commits merged on Sep 4, 2023
398514c
chore: CLI upgraded to 2.6.2rc1
jachro Aug 24, 2023
de5cf8d
chore: Update sbt from 1.9.3 to 1.9.4 (#1671)
RenkuBot Aug 25, 2023
c032724
chore: Add query time measurements to sparql client (#1670)
eikek Aug 25, 2023
9c1a6fd
chore: upgrading sbt in dockerfiles
jachro Aug 25, 2023
51f4425
fix: flaky MemoryLockSpec
jachro Aug 25, 2023
334b246
chore: Update rdf4j-queryparser-sparql from 4.3.5 to 4.3.6 (#1672)
RenkuBot Aug 28, 2023
7367c8a
chore: removed grafana dashboards
jachro Aug 28, 2023
05a6ed0
chore: Update scalafmt-core from 3.7.12 to 3.7.13 (#1677)
RenkuBot Aug 29, 2023
66b858a
feat: project auth graph to be provisioned with visibility from GL (#…
jachro Aug 29, 2023
435fc60
fix: slug finding in token creation flow to work when project removed…
jachro Aug 29, 2023
0f36a70
chore: Update sbt-scalafmt from 2.5.0 to 2.5.1 (#1681)
RenkuBot Aug 30, 2023
42d7f1b
chore: Update testcontainers-scala-postgresql, ... from 0.40.17 to 0.…
RenkuBot Aug 30, 2023
64e7d4e
chore: Update fs2-core from 3.8.0 to 3.9.0 (#1679)
RenkuBot Aug 30, 2023
6ce3925
fix: DBUpdater to take SessionResource and handle retries internally
jachro Aug 30, 2023
c1bc204
chore: Project Delete API to log info on successful removal
jachro Aug 30, 2023
3666c67
feat: renku-core client (#1652)
jachro Aug 30, 2023
96f1875
fix: ExecutionTimeRecorder to work when no label for LabeledHistogram…
jachro Aug 30, 2023
907ab6e
chore: Update fs2-core from 3.9.0 to 3.9.1 (#1683)
RenkuBot Aug 31, 2023
ebde622
chore: Update widoco from 1.4.19 to 1.4.20 (#1684)
RenkuBot Aug 31, 2023
76495ae
fix: rollbacktoawaitingdeletion status change to retry on deadlock (#…
jachro Aug 31, 2023
22a9636
chore: Update sbt-scalafmt from 2.5.1 to 2.5.2 (#1687)
RenkuBot Aug 31, 2023
df55e2b
chore: Update circe-core, circe-generic, ... from 0.14.5 to 0.14.6 (#…
RenkuBot Aug 31, 2023
2c8dbcd
chore: CLI upgraded to 2.6.2
jachro Aug 31, 2023
fc167a1
chore: Update wiremock from 2.35.0 to 3.0.0 (#1685)
RenkuBot Aug 31, 2023
705877c
feat: TG project update API to use DB locking (#1689)
jachro Aug 31, 2023
abe2815
fix: flaky RetrySpec
jachro Aug 31, 2023
f957baf
chore: Update wiremock from 3.0.0 to 3.0.1 (#1691)
RenkuBot Sep 1, 2023
7a1cd58
fix: Project Update API to return 403 on 403 from GL
jachro Sep 1, 2023
a92b00e
chore: Update scalafmt-core from 3.7.13 to 3.7.14 (#1693)
RenkuBot Sep 2, 2023
67d3f43
fix: webhook-service not to fail when Project not found in GL (#1692)
jachro Sep 2, 2023
5934dbb
feat: Implement authorizers to use data from ProjectAuth graph (#1688)
eikek Sep 4, 2023
0144b31
chore: tiny improvements to the log statements in the AT tests
jachro Sep 4, 2023
7a81bff
fix: flaky MemoryLockSpec
jachro Sep 4, 2023
c442bd4
Merge branch 'development'
jachro Sep 4, 2023
2 changes: 1 addition & 1 deletion .scalafmt.conf
@@ -1,4 +1,4 @@
version = "3.7.12"
version = "3.7.14"

runner.dialect = "scala213"

@@ -114,9 +114,6 @@ class WebhookValidationEndpointSpec extends AcceptanceSpec with ApplicationServices

Then("he should get NOT_FOUND response back")
afterDeletionResponse.status shouldBe NotFound

And("the Access Token should be removed from the token repository")
tokenRepositoryClient.GET(s"projects/${project.id}/tokens").status shouldBe NotFound
}
}
}
@@ -60,15 +60,19 @@ object EventLog extends TypeSerializers {
session.prepare(query).flatMap(_.stream(projectId, 32).compile.toList)
}

  def findSyncEvents(projectId: GitLabId)(implicit ioRuntime: IORuntime): List[CategoryName] = execute { session =>
    val query: Query[projects.GitLabId, CategoryName] = sql"""
        SELECT category_name
        FROM subscription_category_sync_time
        WHERE project_id = $projectIdEncoder"""
      .query(varchar)
      .map(category => CategoryName(category))
    session.prepare(query).flatMap(_.stream(projectId, 32).compile.toList)
  }
  def findSyncEventsIO(projectId: GitLabId): IO[List[CategoryName]] =
    sessionResource.flatMap(_.session).use { session =>
      val query: Query[projects.GitLabId, CategoryName] = sql"""
          SELECT category_name
          FROM subscription_category_sync_time
          WHERE project_id = $projectIdEncoder"""
        .query(varchar)
        .map(category => CategoryName(category))
      session.prepare(query).flatMap(_.stream(projectId, 32).compile.toList)
    }

  def findSyncEvents(projectId: GitLabId)(implicit ioRuntime: IORuntime): List[CategoryName] =
    findSyncEventsIO(projectId).unsafeRunSync()

def forceCategoryEventTriggering(categoryName: CategoryName, projectId: projects.GitLabId)(implicit
ioRuntime: IORuntime
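The change above splits the blocking helper: findSyncEventsIO returns the query as a plain IO, and findSyncEvents keeps its old signature by delegating to it with unsafeRunSync(). Callers can therefore poll inside IO rather than blocking per attempt. A minimal sketch of such a caller, assuming only the EventLog helper shown above; the waitForCategory name and its parameters are illustrative, not part of this PR:

import scala.concurrent.duration._
import cats.effect.IO
import fs2.Stream
import io.renku.events.CategoryName
import io.renku.graph.model.projects

// Illustrative helper: re-run findSyncEventsIO every second until the given
// category shows up, giving up after maxTries attempts.
def waitForCategory(projectId: projects.GitLabId, category: CategoryName, maxTries: Int = 20): IO[Boolean] =
  Stream
    .repeatEval(EventLog.findSyncEventsIO(projectId))
    .metered(1.second)
    .map(_.contains(category))
    .takeThrough(found => !found)
    .take(maxTries)
    .compile
    .lastOrError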
@@ -103,7 +103,7 @@ object PostgresDB {
)

def sessionPoolResource[A](dbCfg: DBConfig[_]): Resource[IO, SessionResource[IO, A]] =
sessionPool(dbCfg).map(new SessionResource[IO, A](_))
sessionPool(dbCfg).map(SessionResource[IO, A](_))

def initializeDatabase(cfg: DBConfig[_]): IO[Unit] = {
val session = Session.single[IO](
@@ -22,6 +22,8 @@ import cats.effect.{IO, Resource, Temporal}
import cats.{Applicative, Monad}
import eu.timepit.refined.auto._
import io.renku.db.DBConfigProvider
import io.renku.graph.model.{RenkuUrl, projects}
import io.renku.projectauth.{ProjectAuthData, QueryFilter}
import io.renku.triplesgenerator.TgLockDB.SessionResource
import io.renku.triplesgenerator.{TgLockDB, TgLockDbConfigProvider}
import io.renku.triplesstore._
@@ -54,4 +56,10 @@ object TriplesStore extends InMemoryJena with ProjectsDataset with MigrationsDat
private def waitForReadiness(implicit logger: Logger[IO]): IO[Unit] =
Monad[IO].whileM_(IO(!isRunning))(logger.info("Waiting for TS") >> (Temporal[IO] sleep (500 millis)))

def findProjectAuth(
slug: projects.Slug
)(implicit renkuUrl: RenkuUrl, sqtr: SparqlQueryTimeRecorder[IO], L: Logger[IO]): IO[Option[ProjectAuthData]] =
ProjectSparqlClient[IO](projectsDSConnectionInfo)
.map(_.asProjectAuthService)
.use(_.getAll(QueryFilter.all.withSlug(slug)).compile.last)
}
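findProjectAuth gives the acceptance tests a direct read on the ProjectAuth graph for a single slug. A rough usage sketch follows; it assumes the usual test implicits (RenkuUrl, Logger[IO], SparqlQueryTimeRecorder[IO] and the runtime needed for unsafeRunSync()) are already in scope, as they are in the specs, and that `project` is whatever test data the spec created. The assertion itself is illustrative, not code from this PR:

// Sketch only: check that auth data for the project's slug has been provisioned.
import io.renku.projectauth.ProjectAuthData

val maybeAuthData: Option[ProjectAuthData] =
  TriplesStore.findProjectAuth(project.slug).unsafeRunSync()

maybeAuthData.isDefined shouldBe true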
@@ -16,31 +16,36 @@
* limitations under the License.
*/

package io.renku.graph.acceptancetests.flows
package io.renku.graph.acceptancetests
package flows

import cats.Show
import cats.data.NonEmptyList
import cats.effect.IO
import cats.effect.unsafe.IORuntime
import cats.syntax.all._
import fs2.Stream
import io.renku.eventlog.events.producers.membersync.{categoryName => memberSyncCategory}
import io.renku.eventlog.events.producers.minprojectinfo.{categoryName => minProjectInfoCategory}
import io.renku.events.CategoryName
import io.renku.graph.acceptancetests.data
import io.renku.graph.acceptancetests.db.EventLog
import io.renku.graph.acceptancetests.db.{EventLog, TriplesStore}
import io.renku.graph.acceptancetests.testing.AcceptanceTestPatience
import io.renku.graph.acceptancetests.tooling.EventLogClient.ProjectEvent
import io.renku.graph.acceptancetests.tooling.{AcceptanceSpec, ApplicationServices, ModelImplicits}
import io.renku.graph.model.events.{CommitId, EventId, EventStatus, EventStatusProgress}
import io.renku.graph.model.projects
import io.renku.http.client.AccessToken
import io.renku.logging.TestSparqlQueryTimeRecorder
import io.renku.testtools.IOSpec
import io.renku.triplesstore.SparqlQueryTimeRecorder
import io.renku.webhookservice.model.HookToken
import org.http4s.Status._
import org.scalatest.concurrent.Eventually
import org.scalatest.matchers.should
import org.scalatest.{Assertion, EitherValues}
import org.typelevel.log4cats.Logger
import tooling.EventLogClient.ProjectEvent
import tooling.{AcceptanceSpec, ApplicationServices, ModelImplicits}

import java.lang.Thread.sleep
import scala.annotation.tailrec
import scala.concurrent.duration._

trait TSProvisioning
@@ -77,6 +82,8 @@ trait TSProvisioning
// commitId is the eventId
val condition = commitIds.map(e => EventId(e.value)).toList.map(_ -> EventStatus.TriplesStore)
waitForAllEvents(project.id, condition: _*)
waitForSyncEvents(project.id, memberSyncCategory)
waitForProjectAuthData(project.slug)
}

private def projectEvents(projectId: projects.GitLabId): Stream[IO, List[ProjectEvent]] = {
@@ -96,12 +103,18 @@
val expectedResult = expect.toSet
val ids = expect.map(_._1).toSet

implicit val showTuple: Show[(EventId, EventStatus)] =
Show.show { case (id, status) => s"$id:$status" }

implicit val showTuples: Show[Iterable[(EventId, EventStatus)]] =
Show.show(_.toList.mkString_(", "))

val tries =
projectEvents(projectId)
.map(_.filter(ev => ids.contains(ev.id)).map(ev => ev.id -> ev.status).toSet)
.evalTap(result => Logger[IO].debug(s"Wait for event status: $result -> $expectedResult"))
.evalTap(result => Logger[IO].info(show"Waiting for events on $projectId: $result to match $expectedResult"))
.takeThrough(found => found != expectedResult)
.take(13)
.take(15)

val lastValue = tries.compile.lastOrError.unsafeRunSync()
lastValue shouldBe expectedResult
@@ -111,32 +124,67 @@
val tries =
projectEvents(projectId)
.map(_.map(ev => EventStatusProgress.Stage(ev.status)).toSet)
.evalTap(stages => Logger[IO].debug(s"Wait for final state: $stages"))
.evalTap(stages =>
Logger[IO].info(show"Waiting for the final processing stage on $projectId, currently: $stages")
)
.takeThrough(stages => stages.exists(_ != EventStatusProgress.Stage.Final))
.take(15)

val lastValue = tries.compile.lastOrError.unsafeRunSync()
lastValue.forall(_ == EventStatusProgress.Stage.Final) shouldBe true
}

def `check hook cannot be found`(projectId: projects.GitLabId, accessToken: AccessToken): Assertion = eventually {
webhookServiceClient.`GET projects/:id/events/status`(projectId, accessToken).status shouldBe NotFound
def getSyncEvents(projectId: projects.GitLabId) = {
val getSyncEvents = EventLog.findSyncEventsIO(projectId)

val waitTimes = Stream.iterate(1d)(_ * 1.5).map(_.seconds).covary[IO].evalMap(IO.sleep)
Stream
.repeatEval(getSyncEvents)
.zip(waitTimes)
.map(_._1)
}

def `wait for the Fast Tract event`(projectId: projects.GitLabId)(implicit ioRuntime: IORuntime): Unit = eventually {
def waitForSyncEvents(projectId: projects.GitLabId, category1: CategoryName, categoryN: CategoryName*) = {
val expected = categoryN.toSet + category1

val sleepTime = 1 second
val tries =
getSyncEvents(projectId)
.evalTap(l => Logger[IO].info(s"Sync events for project $projectId: ${l.mkString(", ")}"))
.takeThrough(evs => expected.intersect(evs.toSet) != expected)
.take(13)

@tailrec
def checkIfWasSent(categoryName: CategoryName, attempt: Int = 1): Unit = {
if (attempt > 20) fail(s"'$categoryName' event wasn't sent after ${(sleepTime * attempt).toSeconds}")
val lastValue = tries.compile.lastOrError.unsafeRunSync()
expected.intersect(lastValue.toSet) shouldBe expected
}

if (!EventLog.findSyncEvents(projectId).contains(categoryName)) {
sleep(sleepTime.toMillis)
checkIfWasSent(categoryName)
}
}
def getProjectAuthData(slug: projects.Slug) = {
implicit val sqtr: SparqlQueryTimeRecorder[IO] = TestSparqlQueryTimeRecorder.createUnsafe
val waitTimes = Stream.iterate(1d)(_ * 1.5).map(_.seconds).covary[IO].evalMap(IO.sleep)

checkIfWasSent(CategoryName("ADD_MIN_PROJECT_INFO"))
Stream
.repeatEval(TriplesStore.findProjectAuth(slug))
.zip(waitTimes)
.map(_._1)
}

def waitForProjectAuthData(slug: projects.Slug) = {
val tries =
getProjectAuthData(slug)
.evalTap {
case None => Logger[IO].info(show"auth data not ready for $slug")
case Some(authData) => Logger[IO].info(show"auth data ready $authData")
}
.takeThrough(_.isEmpty)
.take(15)

val lastValue = tries.compile.lastOrError.unsafeRunSync()
lastValue.isDefined shouldBe true
}

def `check hook cannot be found`(projectId: projects.GitLabId, accessToken: AccessToken): Assertion = eventually {
webhookServiceClient.`GET projects/:id/events/status`(projectId, accessToken).status shouldBe NotFound
}

def `wait for the Fast Tract event`(projectId: projects.GitLabId) =
waitForSyncEvents(projectId, minProjectInfoCategory)
}
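The new wait helpers all share one polling shape: repeatedly evaluate an effect, zip each attempt against a stream of exponentially growing sleeps built with Stream.iterate(1d)(_ * 1.5), and stop once the expected value arrives or the attempt budget runs out. A self-contained sketch of that pattern; the object, method names and the dummy check are illustrative, not code from this PR:

import scala.concurrent.duration._
import cats.effect.{IO, IOApp}
import fs2.Stream

object PollUntilReady extends IOApp.Simple {

  // Poll `check` until it yields Some(_) or attempts run out, sleeping
  // 1s, 1.5s, 2.25s, ... between attempts (the same shape as getSyncEvents above).
  def pollUntilDefined[A](check: IO[Option[A]], maxAttempts: Int): IO[Option[A]] = {
    val waitTimes = Stream.iterate(1d)(_ * 1.5).map(_.seconds).covary[IO].evalMap(IO.sleep)
    Stream
      .repeatEval(check)
      .zip(waitTimes)
      .map(_._1)
      .takeThrough(_.isEmpty)
      .take(maxAttempts)
      .compile
      .last
      .map(_.flatten)
  }

  def run: IO[Unit] =
    pollUntilDefined(IO(Option.when(math.random() > 0.7)("ready")), maxAttempts = 15)
      .flatMap(result => IO.println(s"final result: $result"))
}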
@@ -23,17 +23,17 @@ import eu.timepit.refined.auto._
import io.circe.Json
import io.circe.literal._
import io.renku.generators.CommonGraphGenerators.authUsers
import io.renku.generators.Generators._
import io.renku.generators.Generators.Implicits._
import io.renku.generators.Generators._
import io.renku.graph.acceptancetests.data
import io.renku.graph.acceptancetests.data._
import io.renku.graph.acceptancetests.flows.TSProvisioning
import io.renku.graph.acceptancetests.tooling.{AcceptanceSpec, ApplicationServices}
import io.renku.graph.acceptancetests.tooling.TestReadabilityTools._
import io.renku.graph.model._
import io.renku.graph.acceptancetests.tooling.{AcceptanceSpec, ApplicationServices}
import io.renku.graph.model.EventsGenerators.commitIds
import io.renku.graph.model.testentities.{::~, creatorUsernameUpdaterInternal}
import io.renku.graph.model._
import io.renku.graph.model.testentities.generators.EntitiesGenerators._
import io.renku.graph.model.testentities.{::~, creatorUsernameUpdaterInternal}
import io.renku.http.client.UrlEncoder.urlEncode
import io.renku.http.rest.Links.Rel
import io.renku.http.server.EndpointTester._
@@ -57,23 +57,22 @@ class DatasetsResourcesSpec

Feature("GET knowledge-graph/projects/<namespace>/<name>/datasets to find project's datasets") {

val (dataset1 -> dataset2 -> dataset2Modified, testProject) =
renkuProjectEntities(visibilityPublic, creatorGen = cliShapedPersons)
.modify(removeMembers())
.addDataset(datasetEntities(provenanceInternal(cliShapedPersons)))
.addDatasetAndModification(
datasetEntities(provenanceInternal(cliShapedPersons)),
creatorGen = cliShapedPersons
)
.generateOne
val creatorPerson = cliShapedPersons.generateOne
val project =
dataProjects(testProject)
.map(replaceCreatorFrom(creatorPerson, creator.id))
.map(addMemberFrom(creatorPerson, creator.id) >>> addMemberWithId(user.id))
.generateOne

Scenario("As a user I would like to find project's datasets by calling a REST endpoint") {
val (dataset1 -> dataset2 -> dataset2Modified, testProject) =
renkuProjectEntities(visibilityPublic, creatorGen = cliShapedPersons)
.modify(removeMembers())
.addDataset(datasetEntities(provenanceInternal(cliShapedPersons)))
.addDatasetAndModification(
datasetEntities(provenanceInternal(cliShapedPersons)),
creatorGen = cliShapedPersons
)
.generateOne
val creatorPerson = cliShapedPersons.generateOne
val project =
dataProjects(testProject)
.map(replaceCreatorFrom(creatorPerson, creator.id))
.map(addMemberFrom(creatorPerson, creator.id) >>> addMemberWithId(user.id))
.generateOne

Given("some data in the Triples Store")
gitLabStub.addAuthenticated(creator)
14 changes: 13 additions & 1 deletion build.sbt
@@ -52,6 +52,7 @@ lazy val root = project
entitiesViewingsCollector,
projectAuth,
triplesGenerator,
renkuCoreClient,
knowledgeGraph
)

@@ -107,7 +108,10 @@ lazy val graphCommons = project
.in(file("graph-commons"))
.withId("graph-commons")
.settings(commonSettings)
.dependsOn(renkuModel % "compile->compile; test->test")
.dependsOn(
renkuModel % "compile->compile; test->test",
projectAuth % "compile->compile; test->test"
)
.enablePlugins(AutomateHeaderPlugin)

lazy val eventLogApi = project
@@ -215,6 +219,13 @@ lazy val tokenRepository = project
AutomateHeaderPlugin
)

lazy val renkuCoreClient = project
.in(file("renku-core-client"))
.withId("renku-core-client")
.settings(commonSettings)
.dependsOn(graphCommons % "compile->compile; test->test")
.enablePlugins(AutomateHeaderPlugin)

lazy val knowledgeGraph = project
.in(file("knowledge-graph"))
.withId("knowledge-graph")
@@ -231,6 +242,7 @@ lazy val knowledgeGraph = project
graphCommons % "compile->compile; test->test",
entitiesSearch % "compile->compile; test->test",
triplesGeneratorApi % "compile->compile; test->test",
renkuCoreClient % "compile->compile; test->test",
entitiesViewingsCollector
)
.enablePlugins(
2 changes: 1 addition & 1 deletion commit-event-service/Dockerfile
@@ -10,7 +10,7 @@ COPY . .
RUN export PATH="/usr/local/sbt/bin:$PATH" && \
apk update && apk add --no-cache --virtual .build-dependencies bash wget tar git && \
mkdir -p "/usr/local/sbt" && \
wget -qO - "https://github.com/sbt/sbt/releases/download/v1.9.3/sbt-1.9.3.tgz" | tar xz -C /usr/local/sbt --strip-components=1 && \
wget -qO - "https://github.com/sbt/sbt/releases/download/v1.9.4/sbt-1.9.4.tgz" | tar xz -C /usr/local/sbt --strip-components=1 && \
sbt writeVersionToVersionSbt && \
sbt writeVersionToVersionConf && \
sbt "project commit-event-service" stage && \