
Commit

resolved merge conflicts
Raghul-M committed Jan 15, 2025
2 parents 148da03 + 0b92fee commit 128b477
Showing 3 changed files with 52 additions and 8 deletions.
12 changes: 6 additions & 6 deletions ods_ci/tests/Resources/Page/ODH/ODHDashboard/ODHDashboard.robot
@@ -569,8 +569,8 @@ Import New Custom Image
Sleep 1
Open Custom Image Import Popup
Input Text xpath://input[@id="byon-image-location-input"] ${repo}
- Input Text xpath://input[@id="byon-image-name-input"] ${name}
- Input Text xpath://input[@id="byon-image-description-input"] ${description}
+ Input Text xpath://input[@id="byon-image-name"] ${name}
+ Input Text xpath://*[@id="byon-image-description"] ${description}
# No button present anymore?
#Add Softwares To Custom Image ${software}
#Add Packages To Custom Image ${packages}
@@ -631,15 +631,15 @@ Delete Custom Image
${custom_image_kebab_btn}= Set Variable //td[.="${image_name}"]/../td[last()]//button
Click Button xpath:${custom_image_kebab_btn}
${image_name_id}= Replace String ${image_name} ${SPACE} -
- Click Element xpath:${custom_image_kebab_btn}/..//button[@id="custom-${image_name_id}-delete-button"] # robocop: disable
+ Click Element xpath://button[@id="${image_name_id}-delete-button"]
Handle Deletion Confirmation Modal ${image_name} notebook image
# Wait for the image to disappear from the list
Wait Until Page Does Not Contain Element xpath:${custom_image_kebab_btn} timeout=10s
# Assure that the actual ImageStream is also removed
${rc} ${out}= Run And Return Rc And Output
- ... oc wait --for=delete --timeout=10s imagestream -n ${APPLICATIONS_NAMESPACE} custom-${image_name_id}
+ ... oc wait --for=delete --timeout=10s imagestream -n ${APPLICATIONS_NAMESPACE} ${image_name_id}
IF ${rc} != ${0}
- Fail msg=The ImageStream 'custom-${image_name_id}' wasn't deleted from cluster in timeout.
+ Fail msg=The ImageStream '${image_name_id}' wasn't deleted from cluster in timeout.
END
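The keyword above shells out to `oc wait` and fails the test when the return code is non-zero. A minimal Python analogue of Robot Framework's `Run And Return Rc And Output` pattern, as a sketch (the commented `oc` invocation is illustrative and needs a live cluster; the namespace and image name in it are placeholders):

```python
import subprocess

def run_and_return_rc_and_output(cmd: str) -> tuple[int, str]:
    """Run a shell command and return (return code, combined stdout+stderr),
    roughly like Robot Framework's 'Run And Return Rc And Output'."""
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return proc.returncode, (proc.stdout + proc.stderr).strip()

# Hypothetical usage mirroring the suite (requires oc and a cluster):
# rc, out = run_and_return_rc_and_output(
#     "oc wait --for=delete --timeout=10s imagestream -n redhat-ods-applications my-image")
# if rc != 0:
#     raise AssertionError(f"ImageStream 'my-image' wasn't deleted in time: {out}")

rc, out = run_and_return_rc_and_output("echo deleted")
print(rc, out)  # → 0 deleted
```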


@@ -649,7 +649,7 @@ Open Edit Menu For Custom Image
${custom_image_kebab_btn}= Set Variable //td[.="${image_name}"]/../td[last()]//button
Click Button xpath:${custom_image_kebab_btn}
${image_name_id}= Replace String ${image_name} ${SPACE} -
- Click Element xpath:${custom_image_kebab_btn}/..//button[@id="custom-${image_name_id}-edit-button"]
+ Click Element xpath:${custom_image_kebab_btn}/..//button[@id="${image_name_id}-edit-button"]
Wait Until Page Contains Delete Notebook Image

Expand Custom Image Details
@@ -73,7 +73,7 @@ Test Duplicate Image
... packages=${IMG_PACKAGES}
# Assure that the expected error message is shown in the modal window
${image_name_id}= Replace String ${IMG_NAME} ${SPACE} -
- Wait Until Page Contains Unable to add notebook image: imagestreams.image.openshift.io "custom-${image_name_id}" already exists
+ Wait Until Page Contains Unable to add notebook image: imagestreams.image.openshift.io "${image_name_id}" already exists
# Since the image cannot be created, we need to cancel the modal window now
Click Button ${GENERIC_CANCEL_BTN_XP}
[Teardown] Duplicate Image Teardown
@@ -201,4 +201,4 @@ Get Standard Data Science Local Registry URL
[Documentation] Fetches the local URL for the SDS image
${registry} = Run oc get imagestream s2i-generic-data-science-notebook -n ${APPLICATIONS_NAMESPACE} -o json | jq '.status.dockerImageRepository' | sed 's/"//g' # robocop: disable
${tag} = Run oc get imagestream s2i-generic-data-science-notebook -n ${APPLICATIONS_NAMESPACE} -o json | jq '.status.tags[-1].tag' | sed 's/"//g' # robocop: disable
- RETURN ${registry}:${tag}
\ No newline at end of file
+ RETURN ${registry}:${tag}
@@ -40,6 +40,9 @@ ${INFERENCESERVICE_FILEPATH}= ${LLM_RESOURCES_DIRPATH}/serving_runtimes/base/
${INFERENCESERVICE_FILEPATH_NEW}= ${LLM_RESOURCES_DIRPATH}/serving_runtimes/isvc
${INFERENCESERVICE_FILLED_FILEPATH}= ${INFERENCESERVICE_FILEPATH_NEW}/isvc_filled.yaml
${KSERVE_RUNTIME_REST_NAME}= triton-kserve-runtime
+ ${PYTORCH_MODEL_NAME}= resnet50
+ ${INFERENCE_REST_INPUT_PYTORCH}= @tests/Resources/Files/triton/kserve-triton-resnet-rest-input.json
+ ${EXPECTED_INFERENCE_REST_OUTPUT_FILE_PYTORCH}= tests/Resources/Files/triton/kserve-triton-resnet-rest-output.json
${PATTERN}= https:\/\/([^\/:]+)
${PROTOBUFF_FILE}= tests/Resources/Files/triton/grpc_predict_v2.proto
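The `${PATTERN}` variable above captures the hostname out of an HTTPS inference URL. The same extraction in a small Python sketch (the example route URL is hypothetical):

```python
import re

# Same expression as the suite's ${PATTERN} variable
PATTERN = r"https:\/\/([^\/:]+)"

def extract_host(url: str):
    """Return the host part of an https:// URL, or None if it doesn't match."""
    match = re.search(PATTERN, url)
    return match.group(1) if match else None

# Hypothetical KServe route URL:
print(extract_host("https://resnet50-demo.apps.example.com/v2/models/resnet50/infer"))
# → resnet50-demo.apps.example.com
```

The capture group stops at the first `/` or `:`, so it also strips a trailing port or path from the route.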

@@ -84,6 +87,47 @@ Test Python Model Rest Inference Via API (Triton on Kserve) # robocop: off=too-long-test-case
... AND
... Run Keyword If "${KSERVE_MODE}"=="RawDeployment" Terminate Process triton-process kill=true

+ Test Pytorch Model Rest Inference Via API (Triton on Kserve) # robocop: off=too-long-test-case
+ [Documentation] Test the deployment of a pytorch model in Kserve using Triton
+ [Tags] Tier2 RHOAIENG-16909
+ Setup Test Variables model_name=${PYTORCH_MODEL_NAME} use_pvc=${FALSE} use_gpu=${FALSE}
+ ... kserve_mode=${KSERVE_MODE} model_path=triton/model_repository/
+ Set Project And Runtime runtime=${KSERVE_RUNTIME_REST_NAME} protocol=${PROTOCOL} namespace=${test_namespace}
+ ... download_in_pvc=${DOWNLOAD_IN_PVC} model_name=${PYTORCH_MODEL_NAME}
+ ... storage_size=100Mi memory_request=100Mi
+ ${requests}= Create Dictionary memory=1Gi
+ Compile Inference Service YAML isvc_name=${PYTORCH_MODEL_NAME}
+ ... sa_name=models-bucket-sa
+ ... model_storage_uri=${storage_uri}
+ ... model_format=python serving_runtime=${KSERVE_RUNTIME_REST_NAME}
+ ... version="1"
+ ... limits_dict=${limits} requests_dict=${requests} kserve_mode=${KSERVE_MODE}
+ Deploy Model Via CLI isvc_filepath=${INFERENCESERVICE_FILLED_FILEPATH}
+ ... namespace=${test_namespace}
+ # File is not needed anymore after applying
+ Remove File ${INFERENCESERVICE_FILLED_FILEPATH}
+ Wait For Pods To Be Ready label_selector=serving.kserve.io/inferenceservice=${PYTORCH_MODEL_NAME}
+ ... namespace=${test_namespace}
+ ${pod_name}= Get Pod Name namespace=${test_namespace}
+ ... label_selector=serving.kserve.io/inferenceservice=${PYTORCH_MODEL_NAME}
+ ${service_port}= Extract Service Port service_name=${PYTORCH_MODEL_NAME}-predictor protocol=TCP
+ ... namespace=${test_namespace}
+ IF "${KSERVE_MODE}"=="RawDeployment"
+ Start Port-forwarding namespace=${test_namespace} pod_name=${pod_name} local_port=${service_port}
+ ... remote_port=${service_port} process_alias=triton-process
+ END
+ ${EXPECTED_INFERENCE_REST_OUTPUT_PYTORCH}= Load Json File
+ ... file_path=${EXPECTED_INFERENCE_REST_OUTPUT_FILE_PYTORCH} as_string=${TRUE}
+ Verify Model Inference With Retries model_name=${PYTORCH_MODEL_NAME} inference_input=${INFERENCE_REST_INPUT_PYTORCH}
+ ... expected_inference_output=${EXPECTED_INFERENCE_REST_OUTPUT_PYTORCH} project_title=${test_namespace}
+ ... deployment_mode=Cli kserve_mode=${KSERVE_MODE} service_port=${service_port}
+ ... end_point=/v2/models/${model_name}/infer retries=3
+ [Teardown] Run Keywords
+ ... Clean Up Test Project test_ns=${test_namespace}
+ ... isvc_names=${models_names} wait_prj_deletion=${FALSE} kserve_mode=${KSERVE_MODE}
+ ... AND
+ ... Run Keyword If "${KSERVE_MODE}"=="RawDeployment" Terminate Process triton-process kill=true
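The REST call exercised by this test follows the KServe v2 inference protocol: a POST to `/v2/models/<name>/infer` with a JSON body of named input tensors. A hedged sketch of that request shape; the tensor name, shape, data, and host below are toy placeholders, not the real resnet50 payload from `kserve-triton-resnet-rest-input.json`:

```python
import json
import urllib.request

def build_v2_infer_request(tensor_name, shape, datatype, data):
    """Assemble a KServe v2 protocol inference request body."""
    return {"inputs": [{"name": tensor_name, "shape": shape,
                        "datatype": datatype, "data": data}]}

def infer(host, model_name, payload):
    # Endpoint mirrors the suite's end_point=/v2/models/${model_name}/infer
    url = f"https://{host}/v2/models/{model_name}/infer"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # needs a reachable endpoint
        return json.load(resp)

# Toy payload; a real resnet50 input would be a preprocessed image tensor.
payload = build_v2_infer_request("INPUT__0", [1, 3], "FP32", [0.1, 0.2, 0.3])
print(json.dumps(payload, indent=2))
```

In `RawDeployment` mode the suite port-forwards the predictor pod first, so the same request would instead target `localhost:<service_port>`.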

Test Python Model Grpc Inference Via API (Triton on Kserve) # robocop: off=too-long-test-case
[Documentation] Test the deployment of python model in Kserve using Triton
[Tags] Tier2 RHOAIENG-16912
