16 Commits

Author SHA1 Message Date
4fa8cf1a69 chore: update analytics data [skip ci] 2025-08-18 03:00:21 +00:00
0f4da3d79b add dict function for services by cloud_env (#46)
Reviewed-on: #46
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-13 13:45:17 +00:00
ed2a5f575e fix file_path (#45)
Reviewed-on: #45
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-13 13:15:37 +00:00
d6c1eecd79 collect output statistics (#44)
Reviewed-on: #44
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-13 12:04:22 +00:00
394cec25a5 chore: update analytics data [skip ci] (#43)
Co-authored-by: gitea-actions[bot] <actions@users.noreply.local>
Reviewed-on: #43
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
2025-08-11 09:24:30 +00:00
9848825516 change workflow (#41)
Reviewed-on: #41
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-11 08:31:55 +00:00
17cd4cac60 debug workflow (#40)
Co-authored-by: gitea-actions[bot] <actions@users.noreply.local>
Reviewed-on: #40
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-08 14:01:24 +00:00
77b10c9729 collect_website_statics (#39)
Reviewed-on: #39
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-08 13:40:25 +00:00
81fd29520d Update .gitea/workflows/create-weekly-analytics-stats.yaml (#38)
Reviewed-on: #38
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
2025-08-08 10:13:35 +00:00
f635b7351e add collect_statistics tool (#37)
Reviewed-on: #37
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-08-08 10:09:41 +00:00
71b820ebf5 New service function (#36)
Reviewed-on: #36
Reviewed-by: Tino Schreiber <tino.schreiber@t-systems.com>
Co-authored-by: Sebastian Gode <sebastian.gode@telekom.de>
Co-committed-by: Sebastian Gode <sebastian.gode@telekom.de>
2025-08-04 09:36:23 +00:00
e3741c8b53 add tox py3 check
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-07-30 12:14:34 +00:00
167f5cb883 add opensearch update workflow
Reviewed-by: Gode, Sebastian <sebastian.gode@t-systems.com>
Co-authored-by: tischrei <tino.schreiber@t-systems.com>
Co-committed-by: tischrei <tino.schreiber@t-systems.com>
2025-07-29 09:35:01 +00:00
746279eba1 disable import for bcc and sd
Reviewed-by: Hasko, Vladimir <vladimir.hasko@t-systems.com>
Co-authored-by: Sebastian Gode <sebastian.gode@telekom.de>
Co-committed-by: Sebastian Gode <sebastian.gode@telekom.de>
2025-07-23 12:16:18 +00:00
6919069569 delete business dashboard
Reviewed-by: Tino Schreiber <tino.schreiber@t-systems.com>
Co-authored-by: Sebastian Gode <sebastian.gode@telekom.de>
Co-committed-by: Sebastian Gode <sebastian.gode@telekom.de>
2025-07-23 12:04:48 +00:00
92525c56a9 fixing wrong names of dms metadata files
Reviewed-by: Tino Schreiber <tino.schreiber@t-systems.com>
Co-authored-by: Hasko, Vladimir <vladimir.hasko@t-systems.com>
Co-committed-by: Hasko, Vladimir <vladimir.hasko@t-systems.com>
2025-07-16 09:55:16 +00:00
17 changed files with 427 additions and 33 deletions


@@ -0,0 +1,62 @@
name: Create Weekly Analytics Stats

on:
  schedule:
    # 03:00 UTC = 04:00 CET
    - cron: "0 3 * * 1"
  workflow_dispatch:

jobs:
  run-analytics:
    runs-on: ubuntu
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.PUSH_TOKEN }}
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install requests otc-metadata
      - name: Run analytics for eu_de
        env:
          UMAMI_USERNAME: ${{ secrets.UMAMI_USERNAME }}
          UMAMI_PASSWORD: ${{ secrets.UMAMI_PASSWORD }}
        run: |
          python ./tools/collect_statistics.py \
            --website-id "${{ secrets.UMAMI_WEBSITE_ID }}" \
            --cloud-environment "eu_de" \
            --environment "public" \
            --limit "10"
      - name: Run analytics for swiss
        env:
          UMAMI_USERNAME: ${{ secrets.UMAMI_USERNAME }}
          UMAMI_PASSWORD: ${{ secrets.UMAMI_PASSWORD }}
        run: |
          python ./tools/collect_statistics.py \
            --website-id "${{ secrets.UMAMI_WEBSITE_ID }}" \
            --cloud-environment "swiss" \
            --environment "public" \
            --limit "10"
      - name: Commit and push results
        run: |
          git config --global user.name "gitea-actions[bot]"
          git config --global user.email "actions@users.noreply.local"
          git checkout -B analytics-update
          git add otc_metadata/analytics/
          if git diff --cached --quiet; then
            echo "No changes to commit"
          else
            git commit -m "chore: update analytics data [skip ci]"
            git push origin analytics-update --force
          fi


@@ -0,0 +1,18 @@
name: Run Tox Check

on:
  pull_request:
    types: [opened, reopened, synchronize, edited]

jobs:
  tox-py312:
    runs-on: ubuntu
    steps:
      - uses: https://github.com/opentelekomcloud-infra/github-actions/.github/actions/tox-py-test@v1
  tox-pep8:
    runs-on: ubuntu
    steps:
      - uses: https://github.com/opentelekomcloud-infra/github-actions/.github/actions/tox-py-test@v1
        with:
          tox-env: pep8


@@ -0,0 +1,32 @@
name: Update Opensearch filters

on:
  pull_request:
    types:
      - closed
    branches:
      - main

jobs:
  update-opensearch-filters:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install dependencies and the local otc-metadata package
        run: |
          python -m pip install --upgrade pip
          pip install . -r tools-requirements.txt
      - name: Update eu_de and swiss Opensearch indices
        run: |
          python tools/index_metadata.py --hosts ${{ secrets.OPENSEARCH_HOST1 }} --target-environment public --index search_index_de --cloud-environment eu_de --username ${{ secrets.OPENSEARCH_USER }} --password ${{ secrets.OPENSEARCH_PW }} --delete-index
          python tools/index_metadata.py --hosts ${{ secrets.OPENSEARCH_HOST1 }} --target-environment public --index search_index_swiss --cloud-environment swiss --username ${{ secrets.OPENSEARCH_USER }} --password ${{ secrets.OPENSEARCH_PW }} --delete-index


@@ -0,0 +1,48 @@
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pathlib import Path
import json

BASE_DIR = Path(__file__).resolve().parent
analytics_path = BASE_DIR / "public"

cloud_environments = [
    'eu_de',
    'swiss'
]

analytics_data = {k: [] for k in cloud_environments}

# Open and read the JSON data files
for env in cloud_environments:
    file_path = analytics_path / f"{env}.json"
    with file_path.open(encoding="utf-8") as file:
        analytics_data[env] = json.load(file)


class AnalyticsData(object):
    """Encapsulate OTC Analytics data"""

    def __init__(self):
        self._analytics_data = analytics_data

    def all_analytics_data(self):
        """Return all analytics data."""
        return self._analytics_data

    def analytics_data_by_cloud_environment(self, cloud_environment):
        """Return analytics data for a single cloud_environment."""
        if cloud_environment and cloud_environment in self._analytics_data:
            return self._analytics_data[cloud_environment]
        else:
            raise ValueError(
                f"cloud_environment '{cloud_environment}' does not exist.")
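A minimal sketch of how the lookup above behaves, using hypothetical in-memory sample data instead of the public/*.json files the module actually loads:

```python
# Sketch of AnalyticsData's lookup, decoupled from the on-disk JSON files.
# The sample dict below is hypothetical stand-in data, not real analytics.
sample_data = {
    "eu_de": ["evs", "ims", "ecs"],
    "swiss": ["evs", "cbr"],
}


class AnalyticsDataSketch:
    """Like AnalyticsData, but takes its data as a constructor argument."""

    def __init__(self, analytics_data):
        self._analytics_data = analytics_data

    def analytics_data_by_cloud_environment(self, cloud_environment):
        # Reject unknown (or empty) cloud environments explicitly.
        if cloud_environment and cloud_environment in self._analytics_data:
            return self._analytics_data[cloud_environment]
        raise ValueError(
            f"cloud_environment '{cloud_environment}' does not exist.")


data = AnalyticsDataSketch(sample_data)
print(data.analytics_data_by_cloud_environment("swiss"))  # → ['evs', 'cbr']
```

Keeping the data injectable like this also makes the error path straightforward to unit-test.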


@@ -0,0 +1,12 @@
[
  "evs",
  "ims",
  "ecs",
  "cce",
  "obs",
  "rds",
  "sfs",
  "iam",
  "elb",
  "vpn"
]


@@ -0,0 +1,12 @@
[
  "evs",
  "ims",
  "ecs",
  "cce",
  "obs",
  "rds",
  "iam",
  "elb",
  "vpn",
  "cbr"
]


@@ -6,6 +6,7 @@ rst_location: api-ref/source
service_type: bcc
title: API Reference
type: api-ref
disable_import: true
cloud_environments:
  - name: eu_de
    visibility: internal


@@ -6,6 +6,7 @@ rst_location: umn/source
service_type: bcc
title: User Guide
type: umn
disable_import: true
cloud_environments:
  - name: eu_de
    visibility: internal


@@ -1,13 +0,0 @@
---
hc_location: usermanual/bd
html_location: docs/bd/umn
link: /business-dashboard/umn/
rst_location: umn/source
service_type: bd
title: User Guide
type: umn
cloud_environments:
  - name: eu_de
    visibility: public
    pdf_visibility: hidden
    pdf_enabled: false


@@ -6,6 +6,7 @@ rst_location: umn/source
service_type: sd
title: User Guide
type: umn
disable_import: true
cloud_environments:
  - name: swiss
    visibility: public


@@ -1,8 +0,0 @@
---
service_type: bd
repositories:
  - environment: internal
    repo: docs/business-dashboard
    type: gitea
cloud_environments:
  - eu_de


@@ -1,12 +0,0 @@
---
service_category: other
service_title: Business Dashboard
service_type: bd
service_uri: business-dashboard
is_global: false
teams:
  - name: docs-dashboard-rw
    permission: write
cloud_environments:
  - name: eu_de
    visibility: internal


@@ -453,3 +453,55 @@ class Services(object):
        res.sort(key=lambda x: x.get("name", "").lower())
        return res

    def all_services_by_cloud_environment(self, cloud_environment, environments):
        """Retrieve all services filtered by cloud_environment.

        Returns a list sorted by service_title.
        """
        if not (environments and cloud_environment):
            raise Exception(
                "No cloud_environment or environments specified in "
                "function all_services_by_cloud_environment.")
        res = []
        for srv in self.all_services:
            for srv_cloud_environment in srv["cloud_environments"]:
                if srv_cloud_environment["name"] == cloud_environment:
                    for environment in environments:
                        if srv_cloud_environment["visibility"] == environment:
                            res.append(srv)
        # Sort services
        res.sort(key=lambda x: x.get("service_title", "").lower())
        return res

    def all_services_by_cloud_environment_as_dict(self, cloud_environment, environments):
        """Retrieve all services filtered by cloud_environment.

        Returns a dict keyed by service_type.
        """
        if not (environments and cloud_environment):
            raise Exception(
                "No cloud_environment or environments specified in "
                "function all_services_by_cloud_environment_as_dict.")
        res = {}
        for srv in self.all_services:
            for srv_cloud_environment in srv.get("cloud_environments", []):
                if srv_cloud_environment.get("name") == cloud_environment:
                    for environment in environments:
                        if srv_cloud_environment.get("visibility") == environment:
                            service_type = srv.get("service_type")
                            if service_type:
                                res[service_type] = srv
                            break
        res = dict(
            sorted(
                res.items(),
                key=lambda item: item[1].get("service_type", "").lower()
            )
        )
        return res
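The filtering the two methods above perform can be shown standalone; a sketch with hypothetical service records (the real data comes from `self.all_services`):

```python
# Hypothetical service records shaped like entries in self.all_services.
services = [
    {"service_type": "ecs", "service_title": "Elastic Cloud Server",
     "cloud_environments": [{"name": "eu_de", "visibility": "public"}]},
    {"service_type": "cbr", "service_title": "Cloud Backup and Recovery",
     "cloud_environments": [{"name": "swiss", "visibility": "public"}]},
    {"service_type": "bcc", "service_title": "Business Continuity",
     "cloud_environments": [{"name": "eu_de", "visibility": "internal"}]},
]


def services_by_cloud_environment_as_dict(all_services, cloud_environment, environments):
    """Standalone version of the dict-returning filter above."""
    if not (environments and cloud_environment):
        raise Exception("No cloud_environment or environments specified.")
    res = {}
    for srv in all_services:
        for env in srv.get("cloud_environments", []):
            # Keep the service if it exists in the requested cloud with a
            # visibility matching one of the requested environments.
            if env.get("name") == cloud_environment and env.get("visibility") in environments:
                res[srv["service_type"]] = srv
                break
    # Sort keys so callers get deterministic ordering, as the method does.
    return dict(sorted(res.items()))


print(list(services_by_cloud_environment_as_dict(services, "eu_de", ["public", "internal"])))
# → ['bcc', 'ecs']
```

Returning a dict keyed by service_type makes per-service lookup O(1), which is what the `_as_dict` variant buys over the sorted-list version.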

tools/collect_statistics.py (new executable file, 188 additions; diff suppressed because it is too large)