
Bump the maven group across 3 directories with 1 update #1273

Closed

Bump the maven group across 3 directories with 1 update

00cdfbd
Google Cloud Build / fhir-data-pipes-pr (cloud-build-fhir) timed out Jan 8, 2025 in 2h 1m 13s

Summary

Build Information

Trigger fhir-data-pipes-pr
Build 7d679ba2-39fc-42e7-a5bc-7d90d5e886ba
Start 2025-01-07T21:04:00-08:00
Duration 2h0m18.673s
Status TIMEOUT

Steps

Step Status Duration
Launch HAPI Source Server SUCCESS 27.095s
Launch Sink Server Search SUCCESS 25.417s
Launch Sink Server JDBC SUCCESS 25.277s
Wait for the initial Servers Start SUCCESS 1m8.557s
Compile Bunsen and Pipeline SUCCESS 5m57.507s
Build Uploader Image SUCCESS 22.762s
Run Uploader Unit Tests SUCCESS 1.226s
Build E2E Image SUCCESS 2m33.819s
Upload to HAPI SUCCESS 1m21.504s
Build Pipeline Images SUCCESS 25.307s
Run Batch Pipeline in FHIR-search mode with HAPI source SUCCESS 47.594s
Run E2E Test for FHIR-search mode with HAPI source SUCCESS 8.391s
Run Batch Pipeline for JDBC mode with HAPI source SUCCESS 44.967s
Run E2E Test for JDBC mode with HAPI source SUCCESS 8.459s
Run Batch Pipeline for BULK_EXPORT mode with HAPI source SUCCESS 4m12.059s
Run E2E Test for BULK_EXPORT mode with HAPI source SUCCESS 7.812s
Turn down FHIR Sink Server Search SUCCESS 4.284s
Turn down FHIR Sink Server JDBC SUCCESS 4.769s
Create views database SUCCESS 965ms
Launch HAPI FHIR Sink Server Controller SUCCESS 4.304s
Bring up controller and Spark containers SUCCESS 12m54.404s
Run E2E Test for Dockerized Controller and Spark Thriftserver SUCCESS 2m28.05s
Bring down controller and Spark containers SUCCESS 26.706s
Turn down HAPI Source Server SUCCESS 2.896s
Turn down FHIR Sink Server Controller for e2e tests SUCCESS 4.594s
Launch OpenMRS Server and HAPI FHIR Sink Server for OpenMRS SUCCESS 33.375s
Wait for Servers Start SUCCESS 8m22.235s
Launch Streaming Pipeline CANCELLED 1h39m30.208s
Run E2E Test for STREAMING, using OpenMRS Source QUEUED 0s
Upload to OpenMRS QUEUED 0s
Run Batch Pipeline FHIR-search mode with OpenMRS source QUEUED 0s
Run E2E Test for FHIR-search mode with OpenMRS source QUEUED 0s
Run Batch Pipeline for JDBC mode with OpenMRS source QUEUED 0s
Run E2E Test for JDBC mode with OpenMRS source QUEUED 0s
Test Indicators QUEUED 0s
Turn down Webserver and HAPI Server QUEUED 0s

Details

starting build "7d679ba2-39fc-42e7-a5bc-7d90d5e886ba"

FETCHSOURCE
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: 	git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: 	git branch -m <name>
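The hint above can be silenced by choosing the initial branch name up front. A minimal sketch (the repo name `demo` is illustrative, not from the build):

```shell
# Pick 'main' as the default initial branch globally, which silences the hint
# for all future 'git init' calls:
git config --global init.defaultBranch main

# One-off alternative that does not touch global config:
git -c init.defaultBranch=main init demo
```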
Initialized empty Git repository in /workspace/.git/
From https://github.com/google/fhir-data-pipes
 * branch            00cdfbd615877d70d8a00cdb6239f04a263b1dd7 -> FETCH_HEAD
Updating files: 100% (1063/1063), done.
HEAD is now at 00cdfbd Bump the maven group across 3 directories with 1 update
GitCommit:
00cdfbd615877d70d8a00cdb6239f04a263b1dd7
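The FETCHSOURCE phase above follows the usual Cloud Build pattern: initialize an empty repository, fetch one pinned commit, and check out FETCH_HEAD. A sketch of the same pattern, using a throwaway local "remote" instead of the real repository so it runs offline:

```shell
set -e
cd "$(mktemp -d)"

# Throwaway remote with one commit, standing in for the real repository.
git init -q origin-repo
git -C origin-repo -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "pinned commit"
sha=$(git -C origin-repo rev-parse HEAD)

# The Cloud Build pattern: empty repo, fetch the pinned SHA, detach onto it.
git init -q workspace
git -C workspace fetch -q ../origin-repo "$sha"
git -C workspace checkout -q FETCH_HEAD
git -C workspace rev-parse HEAD   # now equals $sha
```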
BUILD
Starting Step #0 - "Launch HAPI Source Server"
Starting Step #1 - "Launch Sink Server Search"
Starting Step #2 - "Launch Sink Server JDBC"
Starting Step #7 - "Build E2E Image"
Starting Step #4 - "Compile Bunsen and Pipeline"
Starting Step #5 - "Build Uploader Image"
Step #0 - "Launch HAPI Source Server": Pulling image: docker/compose
Step #1 - "Launch Sink Server Search": Pulling image: docker/compose
Step #2 - "Launch Sink Server JDBC": Pulling image: docker/compose
Step #4 - "Compile Bunsen and Pipeline": Pulling image: maven:3.8.5-openjdk-17
Step #5 - "Build Uploader Image": Already have image (with digest): gcr.io/cloud-builders/docker
Step #7 - "Build E2E Image": Already have image (with digest): gcr.io/cloud-builders/docker
Step #1 - "Launch Sink Server Search": Using default tag: latest
Step #0 - "Launch HAPI Source Server": Using default tag: latest
Step #2 - "Launch Sink Server JDBC": Using default tag: latest
Step #5 - "Build Uploader Image": Sending build context to Docker daemon  1.466MB

Step #5 - "Build Uploader Image": Step 1/10 : FROM python:3.7-slim
Step #7 - "Build E2E Image": Sending build context to Docker daemon  66.43MB

Step #7 - "Build E2E Image": Step 1/14 : FROM maven:3.8.7-eclipse-temurin-17-focal
Step #0 - "Launch HAPI Source Server": latest: Pulling from docker/compose
Step #0 - "Launch HAPI Source Server": aad63a933944: Pulling fs layer
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Pulling fs layer
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Waiting
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Verifying Checksum
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Download complete
Step #0 - "Launch HAPI Source Server": aad63a933944: Verifying Checksum
Step #0 - "Launch HAPI Source Server": aad63a933944: Download complete
Step #0 - "Launch HAPI Source Server": aad63a933944: Pull complete
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Verifying Checksum
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Download complete
Step #0 - "Launch HAPI Source Server": b396cd7cbac4: Pull complete
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Verifying Checksum
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Download complete
Step #4 - "Compile Bunsen and Pipeline": 3.8.5-openjdk-17: Pulling from library/maven
Step #1 - "Launch Sink Server Search": latest: Pulling from docker/compose
Step #1 - "Launch Sink Server Search": aad63a933944: Already exists
Step #1 - "Launch Sink Server Search": b396cd7cbac4: Already exists
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Pulling fs layer
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Pulling fs layer
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Download complete
Step #2 - "Launch Sink Server JDBC": latest: Pulling from docker/compose
Step #2 - "Launch Sink Server JDBC": aad63a933944: Already exists
Step #5 - "Build Uploader Image": 3.7-slim: Pulling from library/python
Step #2 - "Launch Sink Server JDBC": b396cd7cbac4: Already exists
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Pulling fs layer
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pulling fs layer
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Waiting
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Waiting
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Waiting
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Waiting
Step #5 - "Build Uploader Image": a803e7c4b030: Pulling fs layer
Step #5 - "Build Uploader Image": bf3336e84c8e: Pulling fs layer
Step #5 - "Build Uploader Image": 8973eb85275f: Pulling fs layer
Step #5 - "Build Uploader Image": f9afc3cc0135: Pulling fs layer
Step #5 - "Build Uploader Image": a803e7c4b030: Waiting
Step #5 - "Build Uploader Image": 39312d8b4ab7: Pulling fs layer
Step #5 - "Build Uploader Image": bf3336e84c8e: Waiting
Step #5 - "Build Uploader Image": 8973eb85275f: Waiting
Step #5 - "Build Uploader Image": 39312d8b4ab7: Waiting
Step #5 - "Build Uploader Image": f9afc3cc0135: Waiting
Step #7 - "Build E2E Image": 3.8.7-eclipse-temurin-17-focal: Pulling from library/maven
Step #2 - "Launch Sink Server JDBC": 0426ec0ed60a: Pull complete
Step #1 - "Launch Sink Server Search": 0426ec0ed60a: Pull complete
Step #0 - "Launch HAPI Source Server": 0426ec0ed60a: Pull complete
Step #7 - "Build E2E Image": 7608715873ec: Pulling fs layer
Step #7 - "Build E2E Image": 64a0b7566174: Pulling fs layer
Step #7 - "Build E2E Image": 414e25888ba9: Pulling fs layer
Step #7 - "Build E2E Image": fa1796814410: Pulling fs layer
Step #7 - "Build E2E Image": dc3ab4515b24: Pulling fs layer
Step #7 - "Build E2E Image": 495d1ae42cb9: Pulling fs layer
Step #7 - "Build E2E Image": 66b6d86e5b33: Pulling fs layer
Step #7 - "Build E2E Image": 90062ecd5dec: Pulling fs layer
Step #7 - "Build E2E Image": fa1796814410: Waiting
Step #7 - "Build E2E Image": 7608715873ec: Waiting
Step #7 - "Build E2E Image": dc3ab4515b24: Waiting
Step #7 - "Build E2E Image": 414e25888ba9: Waiting
Step #7 - "Build E2E Image": 90062ecd5dec: Waiting
Step #7 - "Build E2E Image": 495d1ae42cb9: Waiting
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Download complete
Step #0 - "Launch HAPI Source Server": 9ac2a98ece5b: Pull complete
Step #1 - "Launch Sink Server Search": 9ac2a98ece5b: Pull complete
Step #2 - "Launch Sink Server JDBC": 9ac2a98ece5b: Pull complete
Step #2 - "Launch Sink Server JDBC": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #0 - "Launch HAPI Source Server": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #1 - "Launch Sink Server Search": Digest: sha256:b60a020c0f68047b353a4a747f27f5e5ddb17116b7b018762edfb6f7a6439a82
Step #2 - "Launch Sink Server JDBC": Status: Downloaded newer image for docker/compose:latest
Step #0 - "Launch HAPI Source Server": Status: Downloaded newer image for docker/compose:latest
Step #1 - "Launch Sink Server Search": Status: Image is up to date for docker/compose:latest
Step #1 - "Launch Sink Server Search": docker.io/docker/compose:latest
Step #2 - "Launch Sink Server JDBC": docker.io/docker/compose:latest
Step #0 - "Launch HAPI Source Server": docker.io/docker/compose:latest
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Download complete
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Download complete
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Download complete
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Download complete
Step #5 - "Build Uploader Image": a803e7c4b030: Verifying Checksum
Step #5 - "Build Uploader Image": a803e7c4b030: Download complete
Step #5 - "Build Uploader Image": bf3336e84c8e: Download complete
Step #5 - "Build Uploader Image": 8973eb85275f: Verifying Checksum
Step #5 - "Build Uploader Image": 8973eb85275f: Download complete
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Download complete
Step #5 - "Build Uploader Image": f9afc3cc0135: Verifying Checksum
Step #5 - "Build Uploader Image": f9afc3cc0135: Download complete
Step #5 - "Build Uploader Image": 39312d8b4ab7: Verifying Checksum
Step #5 - "Build Uploader Image": 39312d8b4ab7: Download complete
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Verifying Checksum
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Download complete
Step #4 - "Compile Bunsen and Pipeline": 38a980f2cc8a: Pull complete
Step #7 - "Build E2E Image": 7608715873ec: Verifying Checksum
Step #7 - "Build E2E Image": 7608715873ec: Download complete
Step #7 - "Build E2E Image": 64a0b7566174: Verifying Checksum
Step #7 - "Build E2E Image": 64a0b7566174: Download complete
Step #7 - "Build E2E Image": fa1796814410: Verifying Checksum
Step #7 - "Build E2E Image": fa1796814410: Download complete
Step #7 - "Build E2E Image": 495d1ae42cb9: Verifying Checksum
Step #7 - "Build E2E Image": 495d1ae42cb9: Download complete
Step #7 - "Build E2E Image": 66b6d86e5b33: Verifying Checksum
Step #7 - "Build E2E Image": 66b6d86e5b33: Download complete
Step #7 - "Build E2E Image": dc3ab4515b24: Verifying Checksum
Step #7 - "Build E2E Image": dc3ab4515b24: Download complete
Step #5 - "Build Uploader Image": a803e7c4b030: Pull complete
Step #4 - "Compile Bunsen and Pipeline": de849f1cfbe6: Pull complete
Step #7 - "Build E2E Image": 90062ecd5dec: Download complete
Step #5 - "Build Uploader Image": bf3336e84c8e: Pull complete
Step #7 - "Build E2E Image": 7608715873ec: Pull complete
Step #7 - "Build E2E Image": 414e25888ba9: Verifying Checksum
Step #7 - "Build E2E Image": 414e25888ba9: Download complete
Step #5 - "Build Uploader Image": 8973eb85275f: Pull complete
Step #5 - "Build Uploader Image": f9afc3cc0135: Pull complete
Step #7 - "Build E2E Image": 64a0b7566174: Pull complete
Step #5 - "Build Uploader Image": 39312d8b4ab7: Pull complete
Step #5 - "Build Uploader Image": Digest: sha256:b53f496ca43e5af6994f8e316cf03af31050bf7944e0e4a308ad86c001cf028b
Step #5 - "Build Uploader Image": Status: Downloaded newer image for python:3.7-slim
Step #5 - "Build Uploader Image":  ---> a255ffcb469f
Step #5 - "Build Uploader Image": Step 2/10 : WORKDIR /uploader
Step #4 - "Compile Bunsen and Pipeline": a7203ca35e75: Pull complete
Step #5 - "Build Uploader Image":  ---> Running in 7059f37aa7b7
Step #5 - "Build Uploader Image": Removing intermediate container 7059f37aa7b7
Step #5 - "Build Uploader Image":  ---> 0cf974d893bb
Step #5 - "Build Uploader Image": Step 3/10 : COPY  ./ ./
Step #5 - "Build Uploader Image":  ---> b1721800fe29
Step #5 - "Build Uploader Image": Step 4/10 : RUN pip install -r requirements.txt
Step #5 - "Build Uploader Image":  ---> Running in 53a6867f0872
Step #7 - "Build E2E Image": 414e25888ba9: Pull complete
Step #7 - "Build E2E Image": fa1796814410: Pull complete
Step #1 - "Launch Sink Server Search": Creating volume "sink-server-search_hapi-data" with default driver
Step #2 - "Launch Sink Server JDBC": Creating volume "sink-server-jdbc_hapi-data" with default driver
Step #7 - "Build E2E Image": dc3ab4515b24: Pull complete
Step #0 - "Launch HAPI Source Server": Creating network "hapi-compose_default" with the default driver
Step #4 - "Compile Bunsen and Pipeline": 3337662e6dc9: Pull complete
Step #7 - "Build E2E Image": 495d1ae42cb9: Pull complete
Step #7 - "Build E2E Image": 66b6d86e5b33: Pull complete
Step #4 - "Compile Bunsen and Pipeline": 81485058ab89: Pull complete
Step #7 - "Build E2E Image": 90062ecd5dec: Pull complete
Step #0 - "Launch HAPI Source Server": Creating volume "hapi-compose_hapi-fhir-db" with default driver
Step #7 - "Build E2E Image": Digest: sha256:ad4b34f02e52164df83182a2a05074b5288d6e6bcc2dfa0ce3d6fa43ec8b557f
Step #7 - "Build E2E Image": Status: Downloaded newer image for maven:3.8.7-eclipse-temurin-17-focal
Step #7 - "Build E2E Image":  ---> 896b49b4d0b7
Step #7 - "Build E2E Image": Step 2/14 : RUN apt-get update && apt-get install -y jq  python3.8 python3-pip
Step #4 - "Compile Bunsen and Pipeline": b548970362bb: Pull complete
Step #2 - "Launch Sink Server JDBC": Pulling sink-server (hapiproject/hapi:latest)...
Step #1 - "Launch Sink Server Search": Pulling sink-server (hapiproject/hapi:latest)...
Step #0 - "Launch HAPI Source Server": Creating volume "hapi-compose_hapi-server" with default driver
Step #4 - "Compile Bunsen and Pipeline": dbd02ad382f5: Pull complete
Step #4 - "Compile Bunsen and Pipeline": Digest: sha256:3a9c30b3af6278a8ae0007d3a3bf00fff80ec3ed7ae4eb9bfa1772853101549b
Step #4 - "Compile Bunsen and Pipeline": Status: Downloaded newer image for maven:3.8.5-openjdk-17
Step #4 - "Compile Bunsen and Pipeline": docker.io/library/maven:3.8.5-openjdk-17
Step #5 - "Build Uploader Image": Collecting google-auth
Step #5 - "Build Uploader Image":   Downloading google_auth-2.37.0-py2.py3-none-any.whl (209 kB)
Step #5 - "Build Uploader Image": Collecting mock
Step #5 - "Build Uploader Image":   Downloading mock-5.1.0-py3-none-any.whl (30 kB)
Step #0 - "Launch HAPI Source Server": Pulling db (postgres:)...
Step #2 - "Launch Sink Server JDBC": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image": Collecting requests
Step #1 - "Launch Sink Server Search": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image":   Downloading requests-2.31.0-py3-none-any.whl (62 kB)
Step #5 - "Build Uploader Image": Collecting rsa<5,>=3.1.4
Step #5 - "Build Uploader Image":   Downloading rsa-4.9-py3-none-any.whl (34 kB)
Step #5 - "Build Uploader Image": Collecting pyasn1-modules>=0.2.1
Step #5 - "Build Uploader Image":   Downloading pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Step #7 - "Build E2E Image":  ---> Running in ab99cb1f74e2
Step #5 - "Build Uploader Image": Collecting cachetools<6.0,>=2.0.0
Step #5 - "Build Uploader Image":   Downloading cachetools-5.5.0-py3-none-any.whl (9.5 kB)
Step #5 - "Build Uploader Image": Collecting certifi>=2017.4.17
Step #5 - "Build Uploader Image":   Downloading certifi-2024.12.14-py3-none-any.whl (164 kB)
Step #0 - "Launch HAPI Source Server": latest: Pulling from library/postgres
Step #5 - "Build Uploader Image": Collecting charset-normalizer<4,>=2
Step #5 - "Build Uploader Image":   Downloading charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (138 kB)
Step #5 - "Build Uploader Image": Collecting idna<4,>=2.5
Step #5 - "Build Uploader Image":   Downloading idna-3.10-py3-none-any.whl (70 kB)
Step #7 - "Build E2E Image": Get:1 http://security.ubuntu.com/ubuntu focal-security InRelease [128 kB]
Step #5 - "Build Uploader Image": Collecting urllib3<3,>=1.21.1
Step #5 - "Build Uploader Image":   Downloading urllib3-2.0.7-py3-none-any.whl (124 kB)
Step #5 - "Build Uploader Image": Collecting pyasn1<0.6.0,>=0.4.6
Step #5 - "Build Uploader Image":   Downloading pyasn1-0.5.1-py2.py3-none-any.whl (84 kB)
Step #7 - "Build E2E Image": Get:2 http://archive.ubuntu.com/ubuntu focal InRelease [265 kB]
Step #7 - "Build E2E Image": Get:3 http://security.ubuntu.com/ubuntu focal-security/universe amd64 Packages [1,296 kB]
Step #5 - "Build Uploader Image": Installing collected packages: urllib3, pyasn1, mock, idna, charset-normalizer, certifi, cachetools, rsa, requests, pyasn1-modules, google-auth
Step #7 - "Build E2E Image": Get:4 http://security.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [4,259 kB]
Step #7 - "Build E2E Image": Get:5 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages [4,175 kB]
Step #7 - "Build E2E Image": Get:6 http://security.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [30.9 kB]
Step #7 - "Build E2E Image": Get:7 http://archive.ubuntu.com/ubuntu focal-updates InRelease [128 kB]
Step #7 - "Build E2E Image": Get:8 http://archive.ubuntu.com/ubuntu focal-backports InRelease [128 kB]
Step #7 - "Build E2E Image": Get:9 http://archive.ubuntu.com/ubuntu focal/universe amd64 Packages [11.3 MB]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Error stacktraces are turned on.
Step #4 - "Compile Bunsen and Pipeline": [INFO] Scanning for projects...
Step #5 - "Build Uploader Image": Successfully installed cachetools-5.5.0 certifi-2024.12.14 charset-normalizer-3.4.1 google-auth-2.37.0 idna-3.10 mock-5.1.0 pyasn1-0.5.1 pyasn1-modules-0.3.0 requests-2.31.0 rsa-4.9 urllib3-2.0.7
Step #5 - "Build Uploader Image": WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Step #7 - "Build E2E Image": Get:10 http://archive.ubuntu.com/ubuntu focal/main amd64 Packages [1,275 kB]
Step #5 - "Build Uploader Image": [notice] A new release of pip is available: 23.0.1 -> 24.0
Step #5 - "Build Uploader Image": [notice] To update, run: pip install --upgrade pip
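The root-user pip warning above is common inside container builds; the pattern pip recommends is installing into a virtual environment. A minimal sketch (the venv path is illustrative; the actual `requirements.txt` install step is shown as a comment because it needs network access):

```shell
# Create an isolated environment; calling the venv's pip binary directly
# avoids needing 'source activate' inside a Dockerfile RUN step.
python3 -m venv /tmp/uploader-venv
/tmp/uploader-venv/bin/pip --version
# Then, inside the image build:
#   /tmp/uploader-venv/bin/pip install -r requirements.txt
```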
Step #7 - "Build E2E Image": Get:11 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 Packages [177 kB]
Step #7 - "Build E2E Image": Get:12 http://archive.ubuntu.com/ubuntu focal/restricted amd64 Packages [33.4 kB]
Step #7 - "Build E2E Image": Get:13 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [1,587 kB]
Step #7 - "Build E2E Image": Get:14 http://archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [34.6 kB]
Step #7 - "Build E2E Image": Get:15 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [4,653 kB]
Step #7 - "Build E2E Image": Get:16 http://archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [4,456 kB]
Step #7 - "Build E2E Image": Get:17 http://archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [55.2 kB]
Step #7 - "Build E2E Image": Get:18 http://archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [28.6 kB]
Step #7 - "Build E2E Image": Fetched 34.1 MB in 4s (7,950 kB/s)
Step #7 - "Build E2E Image": Reading package lists...
Step #5 - "Build Uploader Image": Removing intermediate container 53a6867f0872
Step #5 - "Build Uploader Image":  ---> efc07637b662
Step #5 - "Build Uploader Image": Step 5/10 : ENV INPUT_DIR="./test_files"
Step #4 - "Compile Bunsen and Pipeline": [INFO] ------------------------------------------------------------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Reactor Build Order:
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] root                                                               [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Parent                                                      [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Extension Structure Definitions                                    [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core                                                        [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core R4                                                     [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Core Stu3                                                   [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] Bunsen Avro                                                        [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] FHIR Analytics                                                     [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] common                                                             [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] batch                                                              [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] streaming                                                          [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] controller                                                         [jar]
Step #4 - "Compile Bunsen and Pipeline": [INFO] coverage                                                           [pom]
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] Using the MultiThreadedBuilder implementation with a thread count of 32
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] -------------------< com.google.fhir.analytics:root >-------------------
Step #4 - "Compile Bunsen and Pipeline": [INFO] Building root 0.2.7-SNAPSHOT                                      [1/13]
Step #4 - "Compile Bunsen and Pipeline": [INFO] --------------------------------[ pom ]---------------------------------
Step #5 - "Build Uploader Image":  ---> Running in de2911a677f9
Step #5 - "Build Uploader Image": Removing intermediate container de2911a677f9
Step #5 - "Build Uploader Image":  ---> 451a2546bdda
Step #5 - "Build Uploader Image": Step 6/10 : ENV CORES=""
Step #4 - "Compile Bunsen and Pipeline": [INFO] 
Step #4 - "Compile Bunsen and Pipeline": [INFO] --- jacoco-maven-plugin:0.8.12:prepare-agent (default) @ root ---
Step #5 - "Build Uploader Image":  ---> Running in e2562a8ae9d5
Step #0 - "Launch HAPI Source Server": Digest: sha256:888402a8cd6075c5dc83a31f58287f13306c318eaad016661ed12e076f3e6341
Step #0 - "Launch HAPI Source Server": Status: Downloaded newer image for postgres:latest
Step #0 - "Launch HAPI Source Server": Pulling hapi-server (hapiproject/hapi:latest)...
Step #5 - "Build Uploader Image": Removing intermediate container e2562a8ae9d5
Step #5 - "Build Uploader Image":  ---> 35ba683e65be
Step #5 - "Build Uploader Image": Step 7/10 : ENV CONVERT=""
Step #5 - "Build Uploader Image":  ---> Running in 0e84efdec28c
Step #0 - "Launch HAPI Source Server": latest: Pulling from hapiproject/hapi
Step #5 - "Build Uploader Image": Removing intermediate container 0e84efdec28c
Step #5 - "Build Uploader Image":  ---> 7ecaa02efd6c
Step #5 - "Build Uploader Image": Step 8/10 : ENV SINK_TYPE="HAPI"
Step #7 - "Build E2E Image": Reading package lists...
Step #5 - "Build Uploader Image":  ---> Running in c0d18bde45c0
Step #7 - "Build E2E Image": Building dependency tree...
Step #7 - "Build E2E Image": Reading state information...
Step #5 - "Build Uploader Image": Removing intermediate container c0d18bde45c0
Step #5 - "Build Uploader Image":  ---> 3a7adc2cd5c9
Step #5 - "Build Uploader Image": Step 9/10 : ENV FHIR_ENDPOINT="http://localhost:8098/fhir"
Step #5 - "Build Uploader Image":  ---> Running in 215662bf8202
Step #7 - "Build E2E Image": The following additional packages will be installed:
Step #7 - "Build E2E Image":   build-essential cpp cpp-9 dirmngr dpkg-dev fakeroot file g++ g++-9 gcc
Step #7 - "Build E2E Image":   gcc-10-base gcc-9 gcc-9-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent
Step #7 - "Build E2E Image":   gpg-wks-client gpg-wks-server gpgconf gpgsm libalgorithm-diff-perl
Step #7 - "Build E2E Image":   libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan5 libassuan0
Step #7 - "Build E2E Image":   libatomic1 libc-dev-bin libc6 libc6-dev libcc1-0 libcrypt-dev libdpkg-perl
Step #7 - "Build E2E Image":   libexpat1 libexpat1-dev libfakeroot libfile-fcntllock-perl libgcc-9-dev
Step #7 - "Build E2E Image":   libgcc-s1 libgomp1 libisl22 libitm1 libjq1 libksba8 liblocale-gettext-perl
Step #7 - "Build E2E Image":   liblsan0 libmagic-mgc libmagic1 libmpc3 libmpdec2 libmpfr6 libnpth0 libonig5
Step #7 - "Build E2E Image":   libpython3-dev libpython3-stdlib libpython3.8 libpython3.8-dev
Step #7 - "Build E2E Image":   libpython3.8-minimal libpython3.8-stdlib libquadmath0 libreadline8
Step #7 - "Build E2E Image":   libstdc++-9-dev libstdc++6 libtsan0 libubsan1 linux-libc-dev make manpages
Step #7 - "Build E2E Image":   manpages-dev mime-support pinentry-curses python-pip-whl python3 python3-dev
Step #7 - "Build E2E Image":   python3-distutils python3-lib2to3 python3-minimal python3-pkg-resources
Step #7 - "Build E2E Image":   python3-setuptools python3-wheel python3.8-dev python3.8-minimal
Step #7 - "Build E2E Image":   readline-common xz-utils zlib1g-dev
Step #7 - "Build E2E Image": Suggested packages:
Step #7 - "Build E2E Image":   cpp-doc gcc-9-locales dbus-user-session libpam-systemd pinentry-gnome3 tor
Step #7 - "Build E2E Image":   debian-keyring g++-multilib g++-9-multilib gcc-9-doc gcc-multilib autoconf
Step #7 - "Build E2E Image":   automake libtool flex bison gdb gcc-doc gcc-9-multilib parcimonie xloadimage
Step #7 - "Build E2E Image":   scdaemon glibc-doc bzr libstdc++-9-doc make-doc man-browser pinentry-doc
Step #7 - "Build E2E Image":   python3-doc python3-tk python3-venv python-setuptools-doc python3.8-venv
Step #7 - "Build E2E Image":   python3.8-doc binfmt-support readline-doc
Step #5 - "Build Uploader Image": Removing intermediate container 215662bf8202
Step #5 - "Build Uploader Image":  ---> d18f1562732e
Step #5 - "Build Uploader Image": Step 10/10 : CMD cd /uploader; python main.py ${SINK_TYPE}     ${FHIR_ENDPOINT} --input_dir ${INPUT_DIR} ${CORES} ${CONVERT}
Step #5 - "Build Uploader Image":  ---> Running in ca0f04fb6cd9
Step #7 - "Build E2E Image": The following NEW packages will be installed:
Step #7 - "Build E2E Image":   build-essential cpp cpp-9 dirmngr dpkg-dev fakeroot file g++ g++-9 gcc gcc-9
Step #7 - "Build E2E Image":   gcc-9-base gnupg gnupg-l10n gnupg-utils gpg gpg-agent gpg-wks-client
Step #7 - "Build E2E Image":   gpg-wks-server gpgconf gpgsm jq libalgorithm-diff-perl
Step #7 - "Build E2E Image":   libalgorithm-diff-xs-perl libalgorithm-merge-perl libasan5 libassuan0
Step #7 - "Build E2E Image":   libatomic1 libc-dev-bin libc6-dev libcc1-0 libcrypt-dev libdpkg-perl
Step #7 - "Build E2E Image":   libexpat1-dev libfakeroot libfile-fcntllock-perl libgcc-9-dev libgomp1
Step #7 - "Build E2E Image":   libisl22 libitm1 libjq1 libksba8 liblocale-gettext-perl liblsan0
Step #7 - "Build E2E Image":   libmagic-mgc libmagic1 libmpc3 libmpdec2 libmpfr6 libnpth0 libonig5
Step #7 - "Build E2E Image":   libpython3-dev libpython3-stdlib libpython3.8 libpython3.8-dev
Step #7 - "Build E2E Image":   libpython3.8-minimal libpython3.8-stdlib libquadmath0 libreadline8
Step #7 - "Build E2E Image":   libstdc++-9-dev libtsan0 libubsan1 linux-libc-dev make manpages manpages-dev
Step #7 - "Build E2E Image":   mime-support pinentry-curses python-pip-whl python3 python3-dev
Step #7 - "Build E2E Image":   python3-distutils python3-lib2to3 python3-minimal python3-pip
Step #7 - "Build E2E Image":   python3-pkg-resources python3-setuptools python3-wheel python3.8
Step #7 - "Build E2E Image":   python3.8-dev python3.8-minimal readline-common xz-utils zlib1g-dev
Step #5 - "Build Uploader Image": Removing intermediate container ca0f04fb6cd9
Step #5 - "Build Uploader Image":  ---> fdba814e7c86
Step #7 - "Build E2E Image": The following packages will be upgraded:
Step #7 - "Build E2E Image":   gcc-10-base libc6 libexpat1 libgcc-s1 libstdc++6
Step #5 - "Build Uploader Image": Successfully built fdba814e7c86
Step #5 - "Build Uploader Image": Successfully tagged us-docker.pkg.dev/cloud-build-fhir/fhir-analytics/synthea-uploader:00cdfbd
Step #4 - "Compile Bunsen and Pipeline": [INFO] argLine set to -javaagent:/root/.m2/repository/org/jacoco/org.jacoco.agent/0.8.12/org.jacoco.agent-0.8.12-runtime.jar=destfile=/workspace/target/jacoco.exec
Step #4 - "Compile Bunsen and Pipeline": [line truncated]
...
[Logs truncated due to log size limitations. For full logs, see https://storage.cloud.google.com/cloud-build-gh-logs/log-7d679ba2-39fc-42e7-a5bc-7d90d5e886ba.txt.]
...
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 200 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: application/json
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Transfer-Encoding: chunked
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 08 Jan 2025 05:26:47 GMT
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [44 bytes data]
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100    38    0    38    0     0   7628      0 --:--:-- --:--:-- --:--:--  9500
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connection #0 to host pipeline-controller left intact
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": SUCCESS
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total patients: 159
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total encounters: 8012
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observations: 34558
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total patient flat rows: 108
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total encounter flat rows: 8012
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observation flat rows: 34558
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Pipeline transformation successfully completed.
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Finding number of patients, encounters and obs in FHIR server http://hapi-server:8080
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   364    0   364    0     0  27341      0 --:--:-- --:--:-- --:--:-- 28000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": {
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "resourceType": "Bundle",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "id": "fba26a78-06c2-469a-873a-8946ecc18b52",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "meta": {
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "lastUpdated": "2025-01-08T05:26:56.591+00:00",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "tag": [ {
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "system": "http://terminology.hl7.org/CodeSystem/v3-ObservationValue",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "code": "SUBSETTED",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "display": "Resource encoded in summary mode"
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     } ]
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   },
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "type": "searchset",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "total": 80
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": }
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Write Patients into File
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Write Observation into File
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Before count
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total FHIR source test patients ---> 80
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total FHIR source test encounters ---> 4006
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total FHIR source test obs ---> 17279
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Finding number of patients, encounters and obs in FHIR server http://sink-server-controller:8080
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   364    0   364    0     0  33731      0 --:--:-- --:--:-- --:--:-- 36400
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": {
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "resourceType": "Bundle",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "id": "7ed3eef3-5122-4eb4-bdd7-c0a0ee919f67",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "meta": {
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "lastUpdated": "2025-01-08T05:26:56.829+00:00",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     "tag": [ {
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "system": "http://terminology.hl7.org/CodeSystem/v3-ObservationValue",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "code": "SUBSETTED",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":       "display": "Resource encoded in summary mode"
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":     } ]
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   },
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "type": "searchset",
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   "total": 80
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": }
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Write Patients into File
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Write Observation into File
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Counting number of patients, encounters and obs sinked to fhir files
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total patients sinked to fhir ---> 80
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total encounters sinked to fhir ---> 4006
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Total observations sinked to fhir ---> 17279
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: FHIR SERVER SINK EXECUTED SUCCESSFULLY USING INCREMENTAL MODE
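The resource totals above are read out of FHIR searchset Bundles like the ones printed earlier in this step; a minimal sketch (assuming `jq`, which the E2E image installs) of extracting the `total` field from such a Bundle:

```shell
# Hypothetical helper: pull the resource count out of a FHIR searchset Bundle.
# The JSON shape matches the _summary responses shown in this log.
bundle='{"resourceType":"Bundle","type":"searchset","total":80}'
total=$(printf '%s' "$bundle" | jq -r '.total')
echo "Total patients sinked to fhir ---> $total"
```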
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Connecting to jdbc:hive2://spark:10000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Connected to: Spark SQL (version 3.3.4)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Driver: Hive JDBC (version 2.3.9)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Transaction isolation: TRANSACTION_REPEATABLE_READ
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 49 rows selected (0.473 seconds)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Beeline version 2.3.9 by Apache Hive
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Closing: 0: jdbc:hive2://spark:10000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Snapshot tables creation verified successfully.
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Canonical tables creation verified successfully.
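The beeline sessions above all connect to the same Spark Thriftserver JDBC endpoint; a tiny sketch of composing that URL (the helper name is hypothetical; the host and port are taken from the log):

```shell
# Build the Hive JDBC URL for a Spark Thriftserver host, as used by beeline above.
thrift_jdbc_url() { printf 'jdbc:hive2://%s:10000' "$1"; }
url=$(thrift_jdbc_url spark)
# beeline would then be invoked as: beeline -u "$url" -e '<SQL>'
echo "$url"
```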
Step #27 - "Launch Streaming Pipeline": WAITING FOR RESOURCES TO SINK
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Connecting to jdbc:hive2://spark:10000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Connected to: Spark SQL (version 3.3.4)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Driver: Hive JDBC (version 2.3.9)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Transaction isolation: TRANSACTION_REPEATABLE_READ
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 1 row selected (2.419 seconds)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Beeline version 2.3.9 by Apache Hive
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Closing: 0: jdbc:hive2://spark:10000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Resource tables data verified successfully.
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Connecting to jdbc:hive2://spark:10000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Connected to: Spark SQL (version 3.3.4)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Driver: Hive JDBC (version 2.3.9)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Transaction isolation: TRANSACTION_REPEATABLE_READ
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 1 row selected (0.348 seconds)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Beeline version 2.3.9 by Apache Hive
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": Closing: 0: jdbc:hive2://spark:10000
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: Updated patient data verified successfully.
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver":                                  Dload  Upload   Total   Spent    Left  Speed
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": *   Trying 192.168.10.12:8080...
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connected to pipeline-controller (192.168.10.12) port 8080 (#0)
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > POST /run?runMode=VIEWS HTTP/1.1
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Host: pipeline-controller:8080
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > User-Agent: curl/7.88.1
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Content-Type: application/json
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > Accept: */*
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": > 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < HTTP/1.1 200 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Type: text/plain;charset=UTF-8
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Content-Length: 7
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < Date: Wed, 08 Jan 2025 05:27:06 GMT
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": < 
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": { [7 bytes data]
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": 
100     7  100     7    0     0   1097      0 --:--:-- --:--:-- --:--:--  1166
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": * Connection #0 to host pipeline-controller left intact
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": SUCCESS
Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver": E2E TEST FOR CONTROLLER SPARK DEPLOYMENT: END!!
Finished Step #21 - "Run E2E Test for Dockerized Controller and Spark Thriftserver"
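The `POST /run?runMode=VIEWS` request shown above is how the E2E script triggers the pipeline controller; a small sketch of building that call (the `run_url` helper is hypothetical, while the host and endpoint come from the log):

```shell
# Build the controller trigger URL for a given host and run mode (e.g. VIEWS).
run_url() {
  printf 'http://%s:8080/run?runMode=%s' "$1" "$2"
}
url=$(run_url pipeline-controller VIEWS)
# The E2E test then issues: curl -s -X POST -H 'Content-Type: application/json' "$url"
echo "$url"
```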
Starting Step #22 - "Bring down controller and Spark containers"
Step #22 - "Bring down controller and Spark containers": Already have image (with digest): docker/compose
Step #22 - "Bring down controller and Spark containers": The PIPELINE_CONFIG variable is not set. Defaulting to a blank string.
Step #22 - "Bring down controller and Spark containers": The DWH_ROOT variable is not set. Defaulting to a blank string.
Step #22 - "Bring down controller and Spark containers": The JAVA_OPTS variable is not set. Defaulting to a blank string.
Step #22 - "Bring down controller and Spark containers": The FHIRDATA_SINKFHIRSERVERURL variable is not set. Defaulting to a blank string.
Step #22 - "Bring down controller and Spark containers": Stopping pipeline-controller ... 
Step #22 - "Bring down controller and Spark containers": Stopping spark-thriftserver  ... 
Step #27 - "Launch Streaming Pipeline": WAITING FOR RESOURCES TO SINK
Step #22 - "Bring down controller and Spark containers": Stopping pipeline-controller ... done
Step #22 - "Bring down controller and Spark containers": Stopping spark-thriftserver  ... done
Step #22 - "Bring down controller and Spark containers": Removing pipeline-controller ... 
Step #22 - "Bring down controller and Spark containers": Removing spark-thriftserver  ... 
Step #22 - "Bring down controller and Spark containers": Removing spark-thriftserver  ... done
Step #22 - "Bring down controller and Spark containers": Removing pipeline-controller ... done
Step #22 - "Bring down controller and Spark containers": Network cloudbuild is external, skipping
Step #22 - "Bring down controller and Spark containers": Removing network docker_default
Finished Step #22 - "Bring down controller and Spark containers"
Starting Step #23 - "Turn down HAPI Source Server"
Step #23 - "Turn down HAPI Source Server": Already have image (with digest): docker/compose
Step #23 - "Turn down HAPI Source Server": Network cloudbuild is external, skipping
Step #23 - "Turn down HAPI Source Server": Removing network docker_default
Step #23 - "Turn down HAPI Source Server": Network docker_default not found.
Finished Step #23 - "Turn down HAPI Source Server"
Starting Step #24 - "Turn down FHIR Sink Server Controller for e2e tests"
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Already have image (with digest): docker/compose
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Stopping sink-server-controller ... 
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Stopping sink-server-controller ... done
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Removing sink-server-controller ... 
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Removing sink-server-controller ... done
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Network cloudbuild is external, skipping
Step #24 - "Turn down FHIR Sink Server Controller for e2e tests": Removing volume sink-server-controller_hapi-data
Finished Step #24 - "Turn down FHIR Sink Server Controller for e2e tests"
Step #27 - "Launch Streaming Pipeline": WAITING FOR RESOURCES TO SINK
TIMEOUT
ERROR: context deadline exceeded
Finished Step #27 - "Launch Streaming Pipeline"
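The repeated `WAITING FOR RESOURCES TO SINK` lines ending in `TIMEOUT` suggest a bounded polling loop; a minimal sketch of that pattern (the `check_done` probe and the retry budget are assumptions, not the real script):

```shell
# Poll a readiness probe until it succeeds or the retry budget is exhausted.
check_done() { [ "${SINK_DONE:-0}" = "1" ]; }  # stand-in for the real resource check
tries=0
max_tries=5
until check_done || [ "$tries" -ge "$max_tries" ]; do
  echo 'WAITING FOR RESOURCES TO SINK'
  tries=$((tries + 1))
  sleep 0  # the real script would sleep between polls
done
check_done || echo 'TIMEOUT'
```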
Step #2 - "Launch Sink Server JDBC": Creating sink-server-jdbc ... done
Step #1 - "Launch Sink Server Search": Creating sink-server-search ... done
Step #0 - "Launch HAPI Source Server": Creating hapi-server  ... done
Step #0 - "Launch HAPI Source Server": Creating hapi-fhir-db ... done
Step #19 - "Launch HAPI FHIR Sink Server Controller": Creating sink-server-controller ... done
Step #20 - "Bring up controller and Spark containers": Creating pipeline-controller ... done
Step #25 - "Launch OpenMRS Server and HAPI FHIR Sink Server for OpenMRS": Creating openmrs                 ... done

Build Log: https://storage.cloud.google.com/cloud-build-gh-logs/log-7d679ba2-39fc-42e7-a5bc-7d90d5e886ba.txt