forked from DataDog/integrations-core
Catching up #1
Open
Ninja-mann wants to merge 6,230 commits into Ninja-mann:master from DataDog:master
Conversation
* added additional screenshots, updated dashboard * updated plaid logs client with 2 additional API endpoints, added metrics client * added additional screenshots, updated dashboard * updated log files * updated log files and images * updated log files and images * updated log files and images
* adds new disk read metrics, and fixes some memory and cpu metrics * remove disk read metrics * changelog * Update constants.py
* adds new disk read metrics, and fixes some memory and cpu metrics * remove disk read metrics * changelog * Add new metrics for disk reads for sacct * Rename 20230.added to 20231.added * Update constants.py * Update common.py * Update common.py * Update common.py * Update common.py --------- Co-authored-by: Kyle Neale <kyle.neale@datadoghq.com>
* Pin all GitHub Actions to the current version and add a PR check to validate that no unpinned versions are used * Remove duplicated workflow for pin hash and use the same version for PR creation action * Exclude DataDog static analyzer GHA from requiring pinned hash
Co-authored-by: nubtron <nubtron@users.noreply.github.com>
* fix slurm partition * changelog * Update constants.py Revert this change. Should be handled in another PR * metadata sort * Update metadata.csv * Update slurm/changelog.d/20169.fixed Co-authored-by: Kyle Neale <kyle.neale@datadoghq.com> * fix tests --------- Co-authored-by: Kyle Neale <kyle.neale@datadoghq.com>
Co-authored-by: Sarah Witt <sarah.witt@datadoghq.com>
* [Release] Bumped slurm version to 1.2.0 * [Release] Update metadata
* test * testing * test * test arangodb * update master.yml * revert
* Unpin MapR * test
* metadata entries * units
* naming consistency * changelog * Update metadata.csv * fix metadata
* Structural changes to order, pre-reqs (profile,keys,rbac), and add troubleshooting auth command * Clarify context, prereqs, and agent sidecar setup * Fix all subintegrations, links, raw API Key * Fix some typos and eng notes * edits * post doc review updates * Remove unnecessary containerPort step for DogStatsD/APM * Fix non-ascii hyphen * Update README.md --------- Co-authored-by: cecilia saixue watt <cecilia.watt@datadoghq.com> Co-authored-by: Steven Blumenthal <steven.blumenthal@datadoghq.com>
* Add exclude_hostname to Postgres and MySQL specs * Validate
Co-authored-by: Zhengda Lu <zhengda.lu@datadoghq.com>
* Added bug fix * Pulled out the section needed to be tested * Added tests * Passing tests * Added changelog * Formatting --------- Co-authored-by: aldrick.castro <aldrick.castro@d>
* add kubevirt tile folder * update kubevirt_api manifest and move dashboard to kubevirt tile * delete configuration path * ddev labeler sync * update README * add troubleshooting title * update changelog * fix validation issues * fix json issue in manifest, and metadata csv metric short name * Update kubevirt/README.md Co-authored-by: Esther Kim <esther.kim@datadoghq.com> * Update kubevirt/README.md Co-authored-by: Esther Kim <esther.kim@datadoghq.com> * rename new kubevirt dashboard to avoid collision with existing one * remove metrics from metadata.csv * Revert "remove metrics from metadata.csv" This reverts commit 9d0fd06. * sort metadata.csv metrics * add kubevirt tile to metrics metadata validation exclude list * fix ddev metrics exclusion to metrics instead of openmetrics block --------- Co-authored-by: Esther Kim <esther.kim@datadoghq.com>
* poc test first pass * log events * logging * run_job_loop, not start * params correction * rpc_events xml parsing basic * batch_events and share utils * timestamp and timing implementation * event file implement * fix file path * return complete xml * parse xml on client side * time parsing and query section separately * convert string to bytes * now test sqlserver parsing * remove sqlserver parsing version * missing statement from rpc_events * print event payload * fix json parsing * add event source to event payload * implement error events * remove config * test start time timestamp calculation * make allen test check more loose * log host and session id as well * delete log * delete correct log * use resolved hostname * try to detect ring buffer event loss * more visibility on timestamp gaps * do not limit max events for testing * temp increase of max events * fill in dbm_type based on event session name * implement sql statement events * implement sp statement events * combine query completions to a single event session * refactors * implement attention events * remove joined event handlers, add query start timing data * clean up * clean up * more clean up * RQT and obfuscate queries first pass * get query completion timestamp into rqt event * better timing data * add more logging * remove caching for now to get visibility for debugging * calculate raw query signature * normalize timestamps * add xe_type * fix event_name for error events * add query_signature to non-RQT event * refactor obfuscating logic * clean up dead code * consolidate more code * normalize timestamp for timestamp filtering * simplify timestamp filtering * fix timestamp gap logging * simplify event logging * omit duration and query_start from query error RQT * omit in XE event too * refactors * missed path fix * add sql fields back * explicitly state sql fields expected for each event session * move raw query signature calculation * implement configuration * unit test first pass * change imports * import change * add handlers test * fix stub import * don't mock event handler * mock keys return dict * fix tests * timestamp mock fixes * TimeMock class * avoid mocking time.time * refactors * fix expected types in rqt event * module end test * space in file name!!
* add attention test * fix attention test * add integration test * send events to datadog * check if sleep makes test consistent * debug test * fix cursor call * grant select to datadog user * grant to bob * wrong setup * delete extra vars * log all calls * run check * follow activity.py pattern * fix event type * debug logging * fix config * refactor test * remove sleep * enable cache, add timestamp test * fix happy path test * linter fixes part 1 * linters part 2 * concat strings for linter * delete statement level event files * Add database instance to events * batch events for query_completion and query_errors * fix unit test serialization and add test for checking batching logic * add method tracking and code clean up * add change log * fix conditional logging * remove timing data now that we have tracked methods * log ANY first rqt event * validate config * fix import * license fix * validate models * make collection interval a number, not int * fix unit tests * update all setup scripts to set up XE sessions * add query visibility into error * clean up code * add raw query signature to query completion and error * revert to execute with retries * debug pipeline, only run on 2022 sqlserver * use convert syntax for adodbapi * add back 2019 sqlserver version * address review comments * delete dead code * parse XML only once * linter * add configurable max events * linter * validate config
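For readers unfamiliar with the Extended Events work above: the commits describe reading XE event data as XML (for example from a ring_buffer target) and turning each `<event>` into a flat payload. Below is a minimal illustrative sketch of that idea only; the function and sample payload are hypothetical and this is not the sqlserver integration's actual code.

```python
# Illustrative sketch only (not the integration's actual implementation):
# parse an Extended Events ring_buffer XML payload into flat event dicts.
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<RingBufferTarget>
  <event name="sql_batch_completed" timestamp="2024-01-01T00:00:00.123Z">
    <data name="duration"><value>1500</value></data>
    <data name="batch_text"><value>SELECT * FROM orders</value></data>
    <action name="session_id"><value>51</value></action>
  </event>
</RingBufferTarget>
"""


def parse_ring_buffer(xml_text):
    """Yield one dict per <event>, merging <data> and <action> child values."""
    root = ET.fromstring(xml_text)
    for event in root.iter("event"):
        record = {
            "xe_type": event.get("name"),
            "timestamp": event.get("timestamp"),
        }
        for child in event:
            value = child.find("value")
            if child.get("name") and value is not None:
                record[child.get("name")] = value.text
        yield record


if __name__ == "__main__":
    for evt in parse_ring_buffer(SAMPLE_XML):
        print(evt)
```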
Co-authored-by: tirthraj.chaudhari <tirthraj.chaudhari@crestdatasys.com>
* Update README.md * Update <CONTAINER_NAME> * Update kubernetes_cluster_autoscaler/README.md Co-authored-by: Lénaïc Huard <L3n41c@users.noreply.github.com> --------- Co-authored-by: Lénaïc Huard <L3n41c@users.noreply.github.com>
* Added metric prefix changes * Added monitor name changes * Removed check metric as per review comment * Reverting the change regarding the check metric * Added changes as per the check metric in manifest * Added suffix for metric
* Add: Kaspersky Integration v1.0.0 * Update: labeler.yml * Update: add event_type in tests and facets in log pipeline yaml * Update: dashboards, pipeline and images * Update: minor changes in dashboard top list * Update: rename changelog file * Update: CODEOWNERS
* Update dependencies * update fdb in ci * test fdb * update docker image * update kafka * revert to patch version * update install kerberos * fix fdb * update changelog numbers * revert docker image update * bump mysql * validate license * revert kafka changes * fix kafka and move urllib3 comment * test --sync * revert redis upgrade --------- Co-authored-by: FlorentClarret <1266346+FlorentClarret@users.noreply.github.com> Co-authored-by: Sarah Witt <sarah.witt@datadoghq.com> Co-authored-by: steveny91 <steven.yuen@datadoghq.com>
…ith target_info enabled (#20555)
* Increase clarity in the scontrol param * changelog * doc changes * Update slurm/README.md Co-authored-by: Bryce Eadie <bryce.eadie@datadoghq.com> * Update slurm/README.md Co-authored-by: Bryce Eadie <bryce.eadie@datadoghq.com> --------- Co-authored-by: Bryce Eadie <bryce.eadie@datadoghq.com>
… step in provisioning (#20296) * Initial, minimal, workato manifest as first step in provisioning * Apply suggestions from code review @urseberry Thanks for the edits! Co-authored-by: Ursula Chen <58821586+urseberry@users.noreply.github.com> * Workato integration: remove stub IMAGES_README * Incorporate no-assets review feedback * Fix formatting of setup instructions * Update workato/README.md Co-authored-by: Ursula Chen <58821586+urseberry@users.noreply.github.com> * Apply suggestions from code review Apply editorial updates Co-authored-by: Ursula Chen <58821586+urseberry@users.noreply.github.com> * Workato: No-assets update * Workato: Induce build pipeline * Workato: Remove trailing space * Workato: update labeler.yml * Workato: update CODEOWNERS --------- Co-authored-by: Brian Williams <brian@mayalane.com> Co-authored-by: Ursula Chen <58821586+urseberry@users.noreply.github.com>
Seems to have been missed, just an initialization line.
…source field (#20559) * always emit session_id as integer * remove event source field * linter * fix unit tests * linter again * add changelog
* Add assets for the iboss integration * Resolve validate manifest issue * Update test results * Resolve metric_name sorting issue in metadata * Update changelog * Add monitors and dashboard images * Update dashboard images * Update dashboard * minor changes * Minor readme update * minor dashboard query change * Add real-time dashboard analytics from logs * Resolve CI failure * minor dashboards update * Minor pipeline update * Address PR comments * Address PR comments * Changes related to exclude cpu_load_avg metrics * Address review comments --------- Co-authored-by: Shubham Vekariya <shubham.vekariya@crestdata.ai> Co-authored-by: akaila-crest <abhi.kaila@crestdata.ai>
Adding Docker and Integrations to the Containerized Configuration section.
* Modernize bs4 interface Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com> * Modernize bs4 interface Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com> --------- Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
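"Modernize bs4 interface" commonly refers to replacing BeautifulSoup's legacy camelCase methods with their snake_case equivalents. A small sketch of that kind of change, under that assumption (the HTML snippet is made up and this is not the commit's actual diff):

```python
# Hedged illustration of a typical bs4 modernization: the legacy camelCase
# methods (e.g. findAll) are deprecated aliases of the snake_case ones.
from bs4 import BeautifulSoup

html = "<table><tr><td>cpu</td><td>42</td></tr></table>"
soup = BeautifulSoup(html, "html.parser")

# Legacy spelling (deprecated alias):
#   cells = soup.findAll("td")
# Modern spelling:
cells = soup.find_all("td")
print([c.get_text() for c in cells])
```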
Co-authored-by: Sarah Witt <sarah.witt@datadoghq.com>
* fix typo in process doc * Update process/README.md Co-authored-by: Jen Gilbert <j.h.gilbert@gmail.com> --------- Co-authored-by: Jen Gilbert <j.h.gilbert@gmail.com>
Remove references to agents<6
…name (#19310) * fix: Test command line result before comparing it to Gunicorn master name * Update gunicorn/datadog_checks/gunicorn/gunicorn.py --------- Co-authored-by: Sarah Witt <sarah.witt@datadoghq.com>
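The fix title above suggests guarding against an empty command line before comparing it to the gunicorn master name. A minimal sketch of that pattern with psutil, assuming a hypothetical marker string; this is not the gunicorn check's actual code:

```python
# Sketch of the guard described above: a process's cmdline can be None or
# empty (e.g. for kernel or zombie processes), so test it before comparing.
import psutil

MASTER_MARKER = "gunicorn: master"  # hypothetical string to match against


def find_master_pids(marker=MASTER_MARKER):
    matches = []
    for proc in psutil.process_iter(attrs=["pid", "cmdline"]):
        cmdline = proc.info.get("cmdline")
        if cmdline and marker in " ".join(cmdline):
            matches.append(proc.info["pid"])
    return matches


if __name__ == "__main__":
    print(find_master_pids())
```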
* Add clarity around the gunicorn.workers definitions * Update gunicorn/metadata.csv
…st otel semantic conventions for collector metrics (#20571)
Co-authored-by: github-merge-queue[bot] <github-merge-queue[bot]@users.noreply.github.com> Co-authored-by: Sarah Witt <sarah.witt@datadoghq.com>
* Update README.md to include enabling logs. The previous documentation didn't include the note that log collection had to be enabled. My customer didn't have logs enabled in their test environment and had to debug to see the issue. * Update pan_firewall/README.md Co-authored-by: Jen Gilbert <jen.gilbert@datadoghq.com> * Changed link format to numbered. Added numbered link 10 to log documentation --------- Co-authored-by: Jen Gilbert <jen.gilbert@datadoghq.com>
…0429) * Initial commit without assets * Added description in manifest file
While working on support for a "native" macOS AArch64/ARM64 build, I ran into the following:

```py
Using existing wheel --> cm_client-45.0.4-py3-none-any.whl --> confluent_kafka-2.8.0-cp312-cp312-macosx_11_0_universal2.whl
Traceback (most recent call last):
[...]
NotImplementedError: This function does not support separate values per-architecture: {'x86_64': [('/usr/lib/libSystem.B.dylib', '1.0.0', '1336.61.1')], 'arm64': [('/Users/runner/builder_root/prefix/lib/librdkafka.1.dylib', '0.0.0', '0.0.0'), ('/usr/lib/libSystem.B.dylib', '1.0.0', '1336.61.1')]}
```

See: https://github.com/DataDog/integrations-core/actions/runs/15850271495/job/44681764771?pr=20455#step:7:17535

It turns out this issue was resolved by:
- matthew-brett/delocate#230

... and shipped with `delocate` **0.13.0**:
- https://github.com/matthew-brett/delocate/releases/tag/0.13.0
- https://pypi.org/project/delocate/0.13.0/

The present change takes advantage of it, which indeed seems to work as intended:

```py
Using existing wheel --> cm_client-45.0.4-py3-none-any.whl --> confluent_kafka-2.8.0-cp312-cp312-macosx_11_0_universal2.whl
Repaired wheel
Libraries copied into the wheel:
  /Users/runner/builder_root/prefix/lib/librdkafka.1.dylib
  /Users/runner/builder_root/prefix/lib/liblmdb.so
  /Users/runner/builder_root/prefix/lib/libcurl.4.dylib
  /Users/runner/builder_root/prefix/lib/libssl.3.dylib
  /Users/runner/builder_root/prefix/lib/libcrypto.3.dylib
  /Users/runner/builder_root/prefix/lib/libz.1.3.1.dylib
```

See: https://github.com/DataDog/integrations-core/actions/runs/15851063188/job/44684393512?pr=20455#step:7:17513
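Since the fix above amounts to a dependency bump, a build script could verify the requirement explicitly. This is a hypothetical guard, not part of the actual change:

```python
# Hypothetical sanity check (not part of the change itself): repairing
# universal2 wheels with per-architecture install names needs delocate >= 0.13.0.
from importlib.metadata import version
from packaging.version import Version

if Version(version("delocate")) < Version("0.13.0"):
    raise RuntimeError(
        "delocate >= 0.13.0 is required to handle separate install names "
        "per architecture when repairing universal2 wheels"
    )
```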
Co-authored-by: github-merge-queue[bot] <github-merge-queue[bot]@users.noreply.github.com>
* Update dependencies * Remove confluent-kafka bump * Fix changelog entries * Remove vSphere --------- Co-authored-by: Kyle-Neale <kyle.neale@datadoghq.com>
What does this PR do?
Motivation
Additional Notes
Review checklist (to be filled by reviewers)
changelog/ and integration/ labels attached