Merge branch 'influx-as-history-db'

This commit is contained in:
Alexander Zobnin
2019-02-14 14:07:32 +03:00
19 changed files with 493 additions and 145 deletions

View File

@@ -6,12 +6,11 @@ coverage:
   status:
     project:
       default:
-        target: 30%
-        threshold: 5%
-    patch: no
-    changes: no
+        threshold: 5
+    patch: off
+    changes: off
 
-comment: false
+comment: off
 
 ignore:
   - "dist/test/test-setup/.*"

View File

@@ -26,7 +26,7 @@ nav:
   - 'Upgrade': 'installation/upgrade.md'
 - Configuration:
   - 'Configuration': 'configuration/index.md'
-  - 'SQL Data Source Configuration': 'configuration/sql_datasource.md'
+  - 'Direct DB Connection Configuration': 'configuration/direct_db_datasource.md'
   - 'Provisioning': 'configuration/provisioning.md'
   - 'Troubleshooting': 'configuration/troubleshooting.md'
 - User Guides:

View File

@@ -32,3 +32,10 @@ database name (usually, `zabbix`) and specify credentials.
 
 ### Security notes
 Make sure you use a read-only user for the Zabbix database.
+
+## InfluxDB
+
+Select the _InfluxDB_ data source type and provide your InfluxDB instance host address and port (8086 is the default). Fill in the
+database name you configured in the [effluence](https://github.com/i-ky/effluence) module config (usually, `zabbix`) and specify credentials.
+
+![Configure InfluxDB data source](../img/configuration-influxdb_ds_config.png)

View File

@@ -61,11 +61,12 @@ amount of data transferred.
 
 Read [how to configure](./sql_datasource) an SQL data source in Grafana.
 
 - **Enable**: enable Direct DB Connection.
-- **SQL Data Source**: Select the SQL data source for the Zabbix database.
+- **Data Source**: Select the data source for the Zabbix history database.
+- **Retention Policy** (InfluxDB only): Specify the retention policy name for fetching long-term stored data. Grafana will fetch data from this retention policy if the query time range is suitable for a trends query. Leave it blank if only the default retention policy is used.
 
 #### Supported databases
 
-**MySQL** and **PostgreSQL** are supported by Grafana.
+**MySQL**, **PostgreSQL** and **InfluxDB** are supported as sources of historical data for the plugin.
 
 ### Alerting

View File

@@ -34,8 +34,11 @@ datasources:
         disableReadOnlyUsersAck: true
         # Direct DB Connection options
         dbConnectionEnable: true
-        # Name of existing SQL datasource
+        # Name of an existing datasource for Direct DB Connection
         dbConnectionDatasourceName: MySQL Zabbix
+        # Retention policy name (InfluxDB only) for fetching long-term stored data.
+        # Leave it blank if only the default retention policy is used.
+        dbConnectionRetentionPolicy: one_year
     version: 1
     editable: false

View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:216a5d1a495fbd2093446c67e09a5dc669c65f541e65fcc13802facd411b5434
size 111917

View File

@@ -1,6 +1,6 @@
 # Direct DB Connection
 
-Since version 4.3 Grafana can use MySQL as a native data source. The Grafana-Zabbix plugin can use this data source for querying data directly from a Zabbix database.
+Since version 4.3 Grafana can use MySQL as a native data source. The idea of Direct DB Connection is that the Grafana-Zabbix plugin can use this data source for querying data directly from a Zabbix database.
 
 One of the most resource-intensive queries for the Zabbix API is the history query. For long time intervals `history.get`
 returns a huge amount of data. In order to display it, the plugin should adjust the time series resolution
@@ -10,6 +10,8 @@ time series, but that data should be loaded and processed on the client side first.
 Also, many users see better performance from direct database queries versus API calls. This could be the result of several reasons,
 such as the additional PHP layer and additional SQL queries (user permission checks).
 
+The Direct DB Connection feature allows using the database transparently for querying historical data. The Grafana-Zabbix plugin now supports a few databases for history queries: MySQL, PostgreSQL and InfluxDB. Regardless of the database type, the idea and data flow remain the same.
+
 ## Data Flow
 
 This chart illustrates how the plugin uses both the Zabbix API and the MySQL data source for querying different types
@@ -76,6 +78,24 @@ ORDER BY time ASC
 
 As you can see, the Grafana-Zabbix plugin uses aggregation by a given time interval. This interval is provided by Grafana and depends on the panel width in pixels. Thus, Grafana displays the data in the proper resolution.
 
+## InfluxDB
+
+Zabbix supports loadable modules, which makes it possible to write history data into an external database. There's a [module](https://github.com/i-ky/effluence) for InfluxDB written by [@i-ky](https://github.com/i-ky) which can export history into InfluxDB in real time.
+
+#### InfluxDB retention policy
+
+In order to keep the database size under control, you should use the InfluxDB retention policy mechanism. It's possible to create a retention policy for long-term data and write aggregated data in the same manner as Zabbix does (trends). This retention policy can then be used by the plugin for getting data after a certain period (the [Retention Policy](../../configuration/#direct-db-connection) option in the data source config). Read more about how to configure a retention policy for use with the plugin in the effluence module [docs](https://github.com/i-ky/effluence#database-sizing).
+
+#### InfluxDB Query
+
+Eventually, the plugin generates an InfluxDB query similar to this:
+
+```sql
+SELECT MEAN("value")
+FROM "history"
+WHERE ("itemid" = '10073' OR "itemid" = '10074')
+AND "time" >= 1540000000s AND "time" <= 1540000060s
+GROUP BY time(10s), "itemid" fill(none)
+```
+
 ## Functions usage with Direct DB Connection
 
 There's only one function that directly affects the backend data. This function is `consolidateBy`. Other functions work on the client side and transform data that comes from the backend. So you should clearly understand that this is pre-aggregated data (by AVG, MAX, MIN, etc).
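The `consolidateBy` mapping described here can be sketched as a small standalone function. This is a sketch, not plugin code: the `influxAgg` helper name is illustrative, while the mapping table matches the `consolidateByFunc` constant added in this commit.

```javascript
// Map a consolidateBy() argument to the aggregation used in the generated query.
const consolidateByFunc = {
  'avg': 'AVG',
  'min': 'MIN',
  'max': 'MAX',
  'sum': 'SUM',
  'count': 'COUNT'
};

// Hypothetical helper: InfluxQL has no AVG() function, so MEAN() is substituted.
function influxAgg(consolidateBy) {
  const agg = consolidateByFunc[consolidateBy] || consolidateBy;
  return agg === 'AVG' ? 'MEAN' : agg;
}

console.log(influxAgg('avg')); // → MEAN
```

Any value not present in the table (a raw function name) is passed through unchanged, which mirrors the connector's fallback behavior.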

View File

@@ -1,7 +1,8 @@
 import _ from 'lodash';
 import { migrateDSConfig } from './migrations';
 
-const SUPPORTED_SQL_DS = ['mysql', 'postgres'];
+const SUPPORTED_SQL_DS = ['mysql', 'postgres', 'influxdb'];
 
 const zabbixVersions = [
   { name: '2.x', value: 2 },
   { name: '3.x', value: 3 },
@@ -27,18 +28,25 @@ export class ZabbixDSConfigController {
     this.current.jsonData = migrateDSConfig(this.current.jsonData);
     _.defaults(this.current.jsonData, defaultConfig);
 
-    this.sqlDataSources = this.getSupportedSQLDataSources();
+    this.dbDataSources = this.getSupportedDBDataSources();
     this.zabbixVersions = _.cloneDeep(zabbixVersions);
     this.autoDetectZabbixVersion();
   }
 
-  getSupportedSQLDataSources() {
+  getSupportedDBDataSources() {
     let datasources = this.datasourceSrv.getAll();
     return _.filter(datasources, ds => {
       return _.includes(SUPPORTED_SQL_DS, ds.type);
     });
   }
 
+  getCurrentDatasourceType() {
+    const dsId = this.current.jsonData.dbConnectionDatasourceId;
+    const currentDs = _.find(this.dbDataSources, { 'id': dsId });
+    return currentDs ? currentDs.type : null;
+  }
+
   autoDetectZabbixVersion() {
     if (!this.current.id) {
       return;

View File

@@ -1,4 +1,5 @@
 import _ from 'lodash';
+import config from 'grafana/app/core/config';
 import * as dateMath from 'grafana/app/core/utils/datemath';
 import * as utils from './utils';
 import * as migrations from './migrations';
@@ -18,6 +19,8 @@ export class ZabbixDatasource {
     this.templateSrv = templateSrv;
     this.zabbixAlertingSrv = zabbixAlertingSrv;
 
+    this.enableDebugLog = config.buildInfo.env === 'development';
+
     // Use custom format for template variables
     this.replaceTemplateVars = _.partial(replaceTemplateVars, this.templateSrv);
@@ -55,6 +58,7 @@ export class ZabbixDatasource {
     this.enableDirectDBConnection = jsonData.dbConnectionEnable || false;
     this.dbConnectionDatasourceId = jsonData.dbConnectionDatasourceId;
     this.dbConnectionDatasourceName = jsonData.dbConnectionDatasourceName;
+    this.dbConnectionRetentionPolicy = jsonData.dbConnectionRetentionPolicy;
 
     let zabbixOptions = {
       url: this.url,
@@ -66,10 +70,11 @@ export class ZabbixDatasource {
       cacheTTL: this.cacheTTL,
       enableDirectDBConnection: this.enableDirectDBConnection,
       dbConnectionDatasourceId: this.dbConnectionDatasourceId,
-      dbConnectionDatasourceName: this.dbConnectionDatasourceName
+      dbConnectionDatasourceName: this.dbConnectionDatasourceName,
+      dbConnectionRetentionPolicy: this.dbConnectionRetentionPolicy,
     };
 
-    this.zabbix = new Zabbix(zabbixOptions, backendSrv, datasourceSrv);
+    this.zabbix = new Zabbix(zabbixOptions, datasourceSrv, backendSrv);
   }
 
 ////////////////////////
@@ -165,11 +170,21 @@ export class ZabbixDatasource {
   /**
    * Query target data for Metrics mode
    */
   queryNumericData(target, timeRange, useTrends, options) {
+    let queryStart, queryEnd;
     let getItemOptions = {
       itemtype: 'num'
     };
     return this.zabbix.getItemsFromTarget(target, getItemOptions)
-      .then(items => this.queryNumericDataForItems(items, target, timeRange, useTrends, options));
+      .then(items => {
+        queryStart = new Date().getTime();
+        return this.queryNumericDataForItems(items, target, timeRange, useTrends, options);
+      }).then(result => {
+        queryEnd = new Date().getTime();
+        if (this.enableDebugLog) {
+          console.log(`Datasource::Performance Query Time (${this.name}): ${queryEnd - queryStart}`);
+        }
+        return result;
+      });
   }
 
   /**
@@ -603,8 +618,8 @@ export class ZabbixDatasource {
     let useTrendsFrom = Math.ceil(dateMath.parse('now-' + this.trendsFrom) / 1000);
     let useTrendsRange = Math.ceil(utils.parseInterval(this.trendsRange) / 1000);
     let useTrends = this.trends && (
-      (timeFrom <= useTrendsFrom) ||
-      (timeTo - timeFrom >= useTrendsRange)
+      (timeFrom < useTrendsFrom) ||
+      (timeTo - timeFrom > useTrendsRange)
     );
     return useTrends;
   }
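The tightened trends condition in the last hunk can be sketched in isolation. This is illustrative only: the `shouldUseTrends` name and flat argument list are assumptions, while the real check lives on the datasource object.

```javascript
// With strict comparisons, a range starting exactly at the trends boundary
// (or spanning exactly trendsRange) is still served from the history tables.
function shouldUseTrends(timeFrom, timeTo, useTrendsFrom, useTrendsRange, trendsEnabled) {
  return trendsEnabled && (
    (timeFrom < useTrendsFrom) ||
    (timeTo - timeFrom > useTrendsRange)
  );
}

// Boundary case: timeFrom equals useTrendsFrom and the span equals useTrendsRange.
console.log(shouldUseTrends(100, 150, 100, 50, true)); // → false
```

This boundary change is what the adjusted spec ranges (`now-7d` → `now-8d`, `now-168h` → `now-169h`) exercise.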

View File

@@ -91,23 +91,39 @@
           checked="ctrl.current.jsonData.dbConnectionEnable">
         </gf-form-switch>
         <div ng-if="ctrl.current.jsonData.dbConnectionEnable">
-          <div class="gf-form max-width-20">
+          <div class="gf-form max-width-30">
             <span class="gf-form-label width-12">
-              SQL Data Source
+              Data Source
               <info-popover mode="right-normal">
-                Select SQL Data Source for Zabbix database.
-                In order to use this feature you should <a href="/datasources/new" target="_blank">create</a> and
-                configure it first. Zabbix plugin uses this data source for querying history data directly from database.
-                This way usually faster than pulling data from Zabbix API, especially on the wide time ranges, and reduces
-                amount of data transfered.
+                Select the data source for the Zabbix history database.
+                In order to use this feature it should be <a href="/datasources/new" target="_blank">created</a> and
+                configured first. The Zabbix plugin uses this data source for querying history data directly from the database.
+                This is usually faster than pulling data from the Zabbix API, especially on wide time ranges, and it reduces
+                the amount of data transferred.
               </info-popover>
             </span>
             <div class="gf-form-select-wrapper max-width-16">
               <select class="gf-form-input" ng-model="ctrl.current.jsonData.dbConnectionDatasourceId"
-                      ng-options="ds.id as ds.name for ds in ctrl.sqlDataSources">
+                      ng-options="ds.id as ds.name for ds in ctrl.dbDataSources">
               </select>
             </div>
           </div>
+          <div ng-if="ctrl.getCurrentDatasourceType() === 'influxdb'">
+            <div class="gf-form max-width-30">
+              <span class="gf-form-label width-12">
+                Retention Policy
+                <info-popover mode="right-normal">
+                  Specify the retention policy name for fetching long-term stored data (optional).
+                  Leave it blank if only the default retention policy is used.
+                </info-popover>
+              </span>
+              <input class="gf-form-input max-width-16"
+                     type="text"
+                     ng-model="ctrl.current.jsonData.dbConnectionRetentionPolicy"
+                     placeholder="Retention policy name">
+            </div>
+          </div>
         </div>
       </div>

View File

@@ -56,7 +56,7 @@ describe('ZabbixDatasource', () => {
   });
 
   it('should use trends if it enabled and time more than trendsFrom', (done) => {
-    let ranges = ['now-7d', 'now-168h', 'now-1M', 'now-1y'];
+    let ranges = ['now-8d', 'now-169h', 'now-1M', 'now-1y'];
 
     _.forEach(ranges, range => {
       ctx.options.range.from = range;
@@ -73,7 +73,7 @@ describe('ZabbixDatasource', () => {
   });
 
   it('shouldnt use trends if it enabled and time less than trendsFrom', (done) => {
-    let ranges = ['now-6d', 'now-167h', 'now-1h', 'now-30m', 'now-30s'];
+    let ranges = ['now-7d', 'now-168h', 'now-1h', 'now-30m', 'now-30s'];
 
     _.forEach(ranges, range => {
       ctx.options.range.from = range;

View File

@@ -1,9 +1,8 @@
import mocks from '../../test-setup/mocks'; import mocks from '../../test-setup/mocks';
import DBConnector from '../zabbix/connectors/dbConnector'; import { DBConnector } from '../zabbix/connectors/dbConnector';
describe('DBConnector', () => { describe('DBConnector', () => {
let ctx = {}; let ctx = {};
const backendSrv = mocks.backendSrvMock;
const datasourceSrv = mocks.datasourceSrvMock; const datasourceSrv = mocks.datasourceSrvMock;
datasourceSrv.loadDatasource.mockResolvedValue({ id: 42, name: 'foo', meta: {} }); datasourceSrv.loadDatasource.mockResolvedValue({ id: 42, name: 'foo', meta: {} });
datasourceSrv.getAll.mockReturnValue([{ id: 42, name: 'foo' }]); datasourceSrv.getAll.mockReturnValue([{ id: 42, name: 'foo' }]);
@@ -16,18 +15,18 @@ describe('DBConnector', () => {
}; };
}); });
it('should load datasource by name by default', () => { it('should try to load datasource by name first', () => {
ctx.options = { ctx.options = {
datasourceName: 'bar' datasourceName: 'bar'
}; };
const dbConnector = new DBConnector(ctx.options, backendSrv, datasourceSrv); const dbConnector = new DBConnector(ctx.options, datasourceSrv);
dbConnector.loadDBDataSource(); dbConnector.loadDBDataSource();
expect(datasourceSrv.getAll).not.toHaveBeenCalled(); expect(datasourceSrv.getAll).not.toHaveBeenCalled();
expect(datasourceSrv.loadDatasource).toHaveBeenCalledWith('bar'); expect(datasourceSrv.loadDatasource).toHaveBeenCalledWith('bar');
}); });
it('should load datasource by id if name not present', () => { it('should load datasource by id if name not present', () => {
const dbConnector = new DBConnector(ctx.options, backendSrv, datasourceSrv); const dbConnector = new DBConnector(ctx.options, datasourceSrv);
dbConnector.loadDBDataSource(); dbConnector.loadDBDataSource();
expect(datasourceSrv.getAll).toHaveBeenCalled(); expect(datasourceSrv.getAll).toHaveBeenCalled();
expect(datasourceSrv.loadDatasource).toHaveBeenCalledWith('foo'); expect(datasourceSrv.loadDatasource).toHaveBeenCalledWith('foo');
@@ -35,14 +34,14 @@ describe('DBConnector', () => {
it('should throw error if no name and id specified', () => { it('should throw error if no name and id specified', () => {
ctx.options = {}; ctx.options = {};
const dbConnector = new DBConnector(ctx.options, backendSrv, datasourceSrv); const dbConnector = new DBConnector(ctx.options, datasourceSrv);
return expect(dbConnector.loadDBDataSource()).rejects.toBe('SQL Data Source name should be specified'); return expect(dbConnector.loadDBDataSource()).rejects.toBe('Data Source name should be specified');
}); });
it('should throw error if datasource with given id is not found', () => { it('should throw error if datasource with given id is not found', () => {
ctx.options.datasourceId = 45; ctx.options.datasourceId = 45;
const dbConnector = new DBConnector(ctx.options, backendSrv, datasourceSrv); const dbConnector = new DBConnector(ctx.options, datasourceSrv);
return expect(dbConnector.loadDBDataSource()).rejects.toBe('SQL Data Source with ID 45 not found'); return expect(dbConnector.loadDBDataSource()).rejects.toBe('Data Source with ID 45 not found');
}); });
}); });
}); });

View File

@@ -0,0 +1,111 @@
import { InfluxDBConnector } from '../zabbix/connectors/influxdb/influxdbConnector';
import { compactQuery } from '../utils';

describe('InfluxDBConnector', () => {
  let ctx = {};

  beforeEach(() => {
    ctx.options = { datasourceName: 'InfluxDB DS', retentionPolicy: 'longterm' };
    ctx.datasourceSrvMock = {
      loadDatasource: jest.fn().mockResolvedValue(
        { id: 42, name: 'InfluxDB DS', meta: {} }
      ),
    };
    ctx.influxDBConnector = new InfluxDBConnector(ctx.options, ctx.datasourceSrvMock);
    ctx.influxDBConnector.invokeInfluxDBQuery = jest.fn().mockResolvedValue([]);
    ctx.defaultQueryParams = {
      itemids: ['123', '234'],
      range: { timeFrom: 15000, timeTill: 15100 },
      intervalSec: 5,
      table: 'history',
      aggFunction: 'MAX'
    };
  });

  describe('When building InfluxDB query', () => {
    it('should build proper query', () => {
      const { itemids, range, intervalSec, table, aggFunction } = ctx.defaultQueryParams;
      const query = ctx.influxDBConnector.buildHistoryQuery(itemids, table, range, intervalSec, aggFunction);
      const expected = compactQuery(`SELECT MAX("value")
        FROM "history" WHERE ("itemid" = '123' OR "itemid" = '234') AND "time" >= 15000s AND "time" <= 15100s
        GROUP BY time(5s), "itemid" fill(none)
      `);
      expect(query).toBe(expected);
    });

    it('should use MEAN instead of AVG', () => {
      const { itemids, range, intervalSec, table } = ctx.defaultQueryParams;
      const aggFunction = 'AVG';
      const query = ctx.influxDBConnector.buildHistoryQuery(itemids, table, range, intervalSec, aggFunction);
      const expected = compactQuery(`SELECT MEAN("value")
        FROM "history" WHERE ("itemid" = '123' OR "itemid" = '234') AND "time" >= 15000s AND "time" <= 15100s
        GROUP BY time(5s), "itemid" fill(none)
      `);
      expect(query).toBe(expected);
    });
  });

  describe('When invoking InfluxDB query', () => {
    it('should query proper table depending on item type', () => {
      const { timeFrom, timeTill } = ctx.defaultQueryParams.range;
      const options = { intervalMs: 5000 };
      const items = [
        { itemid: '123', value_type: 3 }
      ];
      const expectedQuery = compactQuery(`SELECT MEAN("value")
        FROM "history_uint" WHERE ("itemid" = '123') AND "time" >= 15000s AND "time" <= 15100s
        GROUP BY time(5s), "itemid" fill(none)
      `);
      ctx.influxDBConnector.getHistory(items, timeFrom, timeTill, options);
      expect(ctx.influxDBConnector.invokeInfluxDBQuery).toHaveBeenCalledWith(expectedQuery);
    });

    it('should split query if different item types are used', () => {
      const { timeFrom, timeTill } = ctx.defaultQueryParams.range;
      const options = { intervalMs: 5000 };
      const items = [
        { itemid: '123', value_type: 0 },
        { itemid: '234', value_type: 3 },
      ];
      const sharedQueryPart = `AND "time" >= 15000s AND "time" <= 15100s GROUP BY time(5s), "itemid" fill(none)`;
      const expectedQueryFirst = compactQuery(`SELECT MEAN("value")
        FROM "history" WHERE ("itemid" = '123') ${sharedQueryPart}
      `);
      const expectedQuerySecond = compactQuery(`SELECT MEAN("value")
        FROM "history_uint" WHERE ("itemid" = '234') ${sharedQueryPart}
      `);
      ctx.influxDBConnector.getHistory(items, timeFrom, timeTill, options);
      expect(ctx.influxDBConnector.invokeInfluxDBQuery).toHaveBeenCalledTimes(2);
      expect(ctx.influxDBConnector.invokeInfluxDBQuery).toHaveBeenNthCalledWith(1, expectedQueryFirst);
      expect(ctx.influxDBConnector.invokeInfluxDBQuery).toHaveBeenNthCalledWith(2, expectedQuerySecond);
    });

    it('should use the same table for trends query if no retention policy set', () => {
      ctx.influxDBConnector.retentionPolicy = '';
      const { timeFrom, timeTill } = ctx.defaultQueryParams.range;
      const options = { intervalMs: 5000 };
      const items = [
        { itemid: '123', value_type: 3 }
      ];
      const expectedQuery = compactQuery(`SELECT MEAN("value")
        FROM "history_uint" WHERE ("itemid" = '123') AND "time" >= 15000s AND "time" <= 15100s
        GROUP BY time(5s), "itemid" fill(none)
      `);
      ctx.influxDBConnector.getTrends(items, timeFrom, timeTill, options);
      expect(ctx.influxDBConnector.invokeInfluxDBQuery).toHaveBeenCalledWith(expectedQuery);
    });

    it('should use retention policy name for trends query if it was set', () => {
      const { timeFrom, timeTill } = ctx.defaultQueryParams.range;
      const options = { intervalMs: 5000 };
      const items = [
        { itemid: '123', value_type: 3 }
      ];
      const expectedQuery = compactQuery(`SELECT MEAN("value")
        FROM "longterm"."history_uint" WHERE ("itemid" = '123') AND "time" >= 15000s AND "time" <= 15100s
        GROUP BY time(5s), "itemid" fill(none)
      `);
      ctx.influxDBConnector.getTrends(items, timeFrom, timeTill, options);
      expect(ctx.influxDBConnector.invokeInfluxDBQuery).toHaveBeenCalledWith(expectedQuery);
    });
  });
});

View File

@@ -258,6 +258,13 @@ export function parseVersion(version) {
   return { major, minor, patch, meta };
 }
 
+/**
+ * Replaces any run of space-like symbols (tabs, new lines, spaces) with a single space.
+ */
+export function compactQuery(query) {
+  return query.replace(/\s+/g, ' ').trim();
+}
+
 // Fix for backward compatibility with lodash 2.4
 if (!_.includes) {
   _.includes = _.contains;
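A quick sketch of what the new `compactQuery` helper does to a multi-line template string (the helper body below is the same one-liner as in the commit):

```javascript
// Collapse every run of whitespace (spaces, tabs, newlines) to a single space.
function compactQuery(query) {
  return query.replace(/\s+/g, ' ').trim();
}

const q = compactQuery(`SELECT MEAN("value")
  FROM "history"
  GROUP BY time(10s)`);
console.log(q); // → SELECT MEAN("value") FROM "history" GROUP BY time(10s)
```

This is why the specs can compare generated queries against readably indented template literals: both sides are normalized through `compactQuery` first.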

View File

@@ -1,12 +1,40 @@
 import _ from 'lodash';
 
+export const DEFAULT_QUERY_LIMIT = 10000;
+
+export const HISTORY_TO_TABLE_MAP = {
+  '0': 'history',
+  '1': 'history_str',
+  '2': 'history_log',
+  '3': 'history_uint',
+  '4': 'history_text'
+};
+
+export const TREND_TO_TABLE_MAP = {
+  '0': 'trends',
+  '3': 'trends_uint'
+};
+
+export const consolidateByFunc = {
+  'avg': 'AVG',
+  'min': 'MIN',
+  'max': 'MAX',
+  'sum': 'SUM',
+  'count': 'COUNT'
+};
+
+export const consolidateByTrendColumns = {
+  'avg': 'value_avg',
+  'min': 'value_min',
+  'max': 'value_max',
+  'sum': 'num*value_avg' // sum of sums inside the one-hour trend period
+};
+
 /**
  * Base class for external history database connectors. Subclasses should implement `getHistory()`, `getTrends()` and
  * `testDataSource()` methods, which describe how to fetch data from a source other than the Zabbix API.
  */
-export default class DBConnector {
-  constructor(options, backendSrv, datasourceSrv) {
-    this.backendSrv = backendSrv;
+export class DBConnector {
+  constructor(options, datasourceSrv) {
     this.datasourceSrv = datasourceSrv;
     this.datasourceId = options.datasourceId;
     this.datasourceName = options.datasourceName;
@@ -14,26 +42,33 @@ export default class DBConnector {
     this.datasourceTypeName = null;
   }
 
-  loadDBDataSource() {
-    if (!this.datasourceName && this.datasourceId !== undefined) {
-      let ds = _.find(this.datasourceSrv.getAll(), {'id': this.datasourceId});
+  static loadDatasource(dsId, dsName, datasourceSrv) {
+    if (!dsName && dsId !== undefined) {
+      let ds = _.find(datasourceSrv.getAll(), {'id': dsId});
       if (!ds) {
-        return Promise.reject(`SQL Data Source with ID ${this.datasourceId} not found`);
+        return Promise.reject(`Data Source with ID ${dsId} not found`);
       }
-      this.datasourceName = ds.name;
+      dsName = ds.name;
     }
-    if (this.datasourceName) {
-      return this.datasourceSrv.loadDatasource(this.datasourceName)
-        .then(ds => {
-          this.datasourceTypeId = ds.meta.id;
-          this.datasourceTypeName = ds.meta.name;
-          return ds;
-        });
+    if (dsName) {
+      return datasourceSrv.loadDatasource(dsName);
     } else {
-      return Promise.reject(`SQL Data Source name should be specified`);
+      return Promise.reject(`Data Source name should be specified`);
     }
   }
 
+  loadDBDataSource() {
+    return DBConnector.loadDatasource(this.datasourceId, this.datasourceName, this.datasourceSrv)
+      .then(ds => {
+        this.datasourceTypeId = ds.meta.id;
+        this.datasourceTypeName = ds.meta.name;
+        if (!this.datasourceName) {
+          this.datasourceName = ds.name;
+        }
+        return ds;
+      });
+  }
+
   /**
    * Send test request to datasource in order to ensure it's working.
    */
@@ -54,6 +89,10 @@ export class DBConnector {
   getTrends() {
     throw new ZabbixNotImplemented('getTrends()');
   }
+
+  handleGrafanaTSResponse(history, items, addHostName = true) {
+    return convertGrafanaTSResponse(history, items, addHostName);
+  }
 }
 
 // Define Zabbix DB Connector exception type for non-implemented methods
@@ -68,3 +107,48 @@ export class ZabbixNotImplemented {
     return this.message;
   }
 }
+
+/**
+ * Converts time series returned by the data source into the format that Grafana expects.
+ * `time_series` is an array of series:
+ * ```
+ * [{
+ *   name: string,
+ *   points: Array<[value: number, timestamp: number]>
+ * }]
+ * ```
+ */
+function convertGrafanaTSResponse(time_series, items, addHostName) {
+  // uniqBy is needed to deduplicate
+  var hosts = _.uniqBy(_.flatten(_.map(items, 'hosts')), 'hostid');
+  let grafanaSeries = _.map(_.compact(time_series), series => {
+    let itemid = series.name;
+    var item = _.find(items, {'itemid': itemid});
+    var alias = item.name;
+    // only when multiple hosts are actually selected
+    if (_.keys(hosts).length > 1 && addHostName) {
+      var host = _.find(hosts, {'hostid': item.hostid});
+      alias = host.name + ": " + alias;
+    }
+    // CachingProxy deduplicates requests and returns one time series for equal queries.
+    // Clone is needed to prevent changing of a series object shared between all targets.
+    let datapoints = _.cloneDeep(series.points);
+    return {
+      target: alias,
+      datapoints: datapoints
+    };
+  });
+  return _.sortBy(grafanaSeries, 'target');
+}
+
+const defaults = {
+  DBConnector,
+  DEFAULT_QUERY_LIMIT,
+  HISTORY_TO_TABLE_MAP,
+  TREND_TO_TABLE_MAP,
+  consolidateByFunc,
+  consolidateByTrendColumns
+};
+
+export default defaults;
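A minimal, self-contained sketch of the shape conversion performed by `convertGrafanaTSResponse`: the item data below is made up for illustration, and the host-prefixing and `cloneDeep` steps of the real function are omitted.

```javascript
// Series named by itemid become Grafana targets named by the matching item,
// sorted by target name.
function toGrafanaSeries(timeSeries, items) {
  return timeSeries
    .map(series => {
      const item = items.find(it => it.itemid === series.name);
      return { target: item.name, datapoints: series.points.slice() };
    })
    .sort((a, b) => a.target.localeCompare(b.target));
}

const out = toGrafanaSeries(
  [{ name: '10073', points: [[1.5, 1540000000000]] }],
  [{ itemid: '10073', name: 'CPU load' }]
);
console.log(out[0].target); // → CPU load
```

The real function additionally prefixes the alias with the host name when more than one host is selected, and deep-clones the points because the caching proxy shares one series object between targets.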

View File

@@ -0,0 +1,112 @@
import _ from 'lodash';
import { compactQuery } from '../../../utils';
import { DBConnector, HISTORY_TO_TABLE_MAP, consolidateByFunc } from '../dbConnector';
export class InfluxDBConnector extends DBConnector {
constructor(options, datasourceSrv) {
super(options, datasourceSrv);
this.retentionPolicy = options.retentionPolicy;
super.loadDBDataSource().then(ds => {
this.influxDS = ds;
return ds;
});
}
/**
* Try to invoke test query for one of Zabbix database tables.
*/
testDataSource() {
return this.influxDS.testDatasource();
}
getHistory(items, timeFrom, timeTill, options) {
let { intervalMs, consolidateBy, retentionPolicy } = options;
const intervalSec = Math.ceil(intervalMs / 1000);
const range = { timeFrom, timeTill };
consolidateBy = consolidateBy || 'avg';
const aggFunction = consolidateByFunc[consolidateBy] || consolidateBy;
// Group items by value type and perform request for each value type
const grouped_items = _.groupBy(items, 'value_type');
const promises = _.map(grouped_items, (items, value_type) => {
const itemids = _.map(items, 'itemid');
const table = HISTORY_TO_TABLE_MAP[value_type];
const query = this.buildHistoryQuery(itemids, table, range, intervalSec, aggFunction, retentionPolicy);
return this.invokeInfluxDBQuery(query);
});
return Promise.all(promises)
.then(_.flatten)
.then(results => {
return handleInfluxHistoryResponse(results);
});
}
getTrends(items, timeFrom, timeTill, options) {
options.retentionPolicy = this.retentionPolicy;
return this.getHistory(items, timeFrom, timeTill, options);
}
buildHistoryQuery(itemids, table, range, intervalSec, aggFunction, retentionPolicy) {
const { timeFrom, timeTill } = range;
const measurement = retentionPolicy ? `"${retentionPolicy}"."${table}"` : `"${table}"`;
const AGG = aggFunction === 'AVG' ? 'MEAN' : aggFunction;
const where_clause = this.buildWhereClause(itemids);
const query = `SELECT ${AGG}("value") FROM ${measurement}
WHERE ${where_clause} AND "time" >= ${timeFrom}s AND "time" <= ${timeTill}s
GROUP BY time(${intervalSec}s), "itemid" fill(none)`;
return compactQuery(query);
}
buildWhereClause(itemids) {
const itemidsWhere = itemids.map(itemid => `"itemid" = '${itemid}'`).join(' OR ');
return `(${itemidsWhere})`;
}
invokeInfluxDBQuery(query) {
return this.influxDS._seriesQuery(query)
.then(data => data && data.results ? data.results : []);
}
}
///////////////////////////////////////////////////////////////////////////////
function handleInfluxHistoryResponse(results) {
  if (!results) {
    return [];
  }

  const seriesList = [];
  for (let i = 0; i < results.length; i++) {
    const result = results[i];
    if (result.error) {
      const error = `InfluxDB error: ${result.error}`;
      return Promise.reject(new Error(error));
    }
    if (!result.series) {
      continue;
    }

    const influxSeriesList = result.series;
    for (let y = 0; y < influxSeriesList.length; y++) {
      const influxSeries = influxSeriesList[y];
      const datapoints = [];
      if (influxSeries.values) {
        // Use a separate index here: reusing `i` would clobber the outer loop
        // counter and skip the remaining results.
        for (let j = 0; j < influxSeries.values.length; j++) {
          // InfluxDB rows arrive as [time, value]; Grafana expects [value, time]
          datapoints[j] = [influxSeries.values[j][1], influxSeries.values[j][0]];
        }
      }
      const timeSeries = {
        name: influxSeries.tags.itemid,
        points: datapoints
      };
      seriesList.push(timeSeries);
    }
  }

  return seriesList;
}
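The query shape produced by `buildHistoryQuery` and the datapoint flip in the handler are easy to exercise outside the class. The sketch below is a standalone approximation, not the plugin's actual module: `compactQuery` here is an assumption modeled on the plugin's whitespace-collapsing utils helper, and the itemids and timestamps are made up.

```javascript
// Standalone sketch of the InfluxQL query building used by the connector.
function compactQuery(query) {
  return query.replace(/\s+/g, ' ').trim();
}

function buildWhereClause(itemids) {
  return '(' + itemids.map(itemid => `"itemid" = '${itemid}'`).join(' OR ') + ')';
}

function buildHistoryQuery(itemids, table, { timeFrom, timeTill }, intervalSec, aggFunction, retentionPolicy) {
  const measurement = retentionPolicy ? `"${retentionPolicy}"."${table}"` : `"${table}"`;
  const AGG = aggFunction === 'AVG' ? 'MEAN' : aggFunction; // InfluxQL calls it MEAN, not AVG
  return compactQuery(`SELECT ${AGG}("value") FROM ${measurement}
    WHERE ${buildWhereClause(itemids)} AND "time" >= ${timeFrom}s AND "time" <= ${timeTill}s
    GROUP BY time(${intervalSec}s), "itemid" fill(none)`);
}

// InfluxDB rows arrive as [time, value]; Grafana wants [value, time].
function flipValues(values) {
  return values.map(([time, value]) => [value, time]);
}

const query = buildHistoryQuery(['10105', '10106'], 'history',
  { timeFrom: 1550000000, timeTill: 1550003600 }, 60, 'AVG', 'zabbix');
console.log(query);
// SELECT MEAN("value") FROM "zabbix"."history" WHERE ("itemid" = '10105' OR "itemid" = '10106') AND "time" >= 1550000000s AND "time" <= 1550003600s GROUP BY time(60s), "itemid" fill(none)
```

Note how the retention policy, when set, becomes part of the measurement path rather than a separate clause, which is why it is threaded through `getTrends` as an option.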

View File

@@ -1,51 +1,26 @@
 import _ from 'lodash';
+import { compactQuery } from '../../../utils';
 import mysql from './mysql';
 import postgres from './postgres';
-import DBConnector from '../dbConnector';
+import dbConnector, { DBConnector, DEFAULT_QUERY_LIMIT, HISTORY_TO_TABLE_MAP, TREND_TO_TABLE_MAP } from '../dbConnector';

 const supportedDatabases = {
   mysql: 'mysql',
   postgres: 'postgres'
 };

-const DEFAULT_QUERY_LIMIT = 10000;
-
-const HISTORY_TO_TABLE_MAP = {
-  '0': 'history',
-  '1': 'history_str',
-  '2': 'history_log',
-  '3': 'history_uint',
-  '4': 'history_text'
-};
-
-const TREND_TO_TABLE_MAP = {
-  '0': 'trends',
-  '3': 'trends_uint'
-};
-
-const consolidateByFunc = {
-  'avg': 'AVG',
-  'min': 'MIN',
-  'max': 'MAX',
-  'sum': 'SUM',
-  'count': 'COUNT'
-};
-
-const consolidateByTrendColumns = {
-  'avg': 'value_avg',
-  'min': 'value_min',
-  'max': 'value_max',
-  'sum': 'num*value_avg' // sum of sums inside the one-hour trend period
-};
-
 export class SQLConnector extends DBConnector {
-  constructor(options, backendSrv, datasourceSrv) {
-    super(options, backendSrv, datasourceSrv);
+  constructor(options, datasourceSrv) {
+    super(options, datasourceSrv);
     this.limit = options.limit || DEFAULT_QUERY_LIMIT;
     this.sqlDialect = null;

     super.loadDBDataSource()
-    .then(() => this.loadSQLDialect());
+    .then(ds => {
+      this.backendSrv = ds.backendSrv;
+      this.loadSQLDialect();
+    });
   }

   loadSQLDialect() {
@@ -69,7 +44,7 @@ export class SQLConnector extends DBConnector {
     let intervalSec = Math.ceil(intervalMs / 1000);
     consolidateBy = consolidateBy || 'avg';
-    let aggFunction = consolidateByFunc[consolidateBy];
+    let aggFunction = dbConnector.consolidateByFunc[consolidateBy];

     // Group items by value type and perform request for each value type
     let grouped_items = _.groupBy(items, 'value_type');
@@ -78,7 +53,7 @@ export class SQLConnector extends DBConnector {
     let table = HISTORY_TO_TABLE_MAP[value_type];
     let query = this.sqlDialect.historyQuery(itemids, table, timeFrom, timeTill, intervalSec, aggFunction);

-    query = compactSQLQuery(query);
+    query = compactQuery(query);
     return this.invokeSQLQuery(query);
   });
@@ -92,7 +67,7 @@ export class SQLConnector extends DBConnector {
     let intervalSec = Math.ceil(intervalMs / 1000);
     consolidateBy = consolidateBy || 'avg';
-    let aggFunction = consolidateByFunc[consolidateBy];
+    let aggFunction = dbConnector.consolidateByFunc[consolidateBy];

     // Group items by value type and perform request for each value type
     let grouped_items = _.groupBy(items, 'value_type');
@@ -100,10 +75,10 @@ export class SQLConnector extends DBConnector {
     let itemids = _.map(items, 'itemid').join(', ');
     let table = TREND_TO_TABLE_MAP[value_type];
     let valueColumn = _.includes(['avg', 'min', 'max', 'sum'], consolidateBy) ? consolidateBy : 'avg';
-    valueColumn = consolidateByTrendColumns[valueColumn];
+    valueColumn = dbConnector.consolidateByTrendColumns[valueColumn];
     let query = this.sqlDialect.trendsQuery(itemids, table, timeFrom, timeTill, intervalSec, aggFunction, valueColumn);

-    query = compactSQLQuery(query);
+    query = compactQuery(query);
     return this.invokeSQLQuery(query);
   });
@@ -112,10 +87,6 @@ export class SQLConnector extends DBConnector {
     });
   }

-  handleGrafanaTSResponse(history, items, addHostName = true) {
-    return convertGrafanaTSResponse(history, items, addHostName);
-  }
-
   invokeSQLQuery(query) {
     let queryDef = {
       refId: 'A',
@@ -142,33 +113,3 @@ export class SQLConnector extends DBConnector {
     });
   }
 }
-
-///////////////////////////////////////////////////////////////////////////////
-
-function convertGrafanaTSResponse(time_series, items, addHostName) {
-  // uniqBy is needed to deduplicate
-  var hosts = _.uniqBy(_.flatten(_.map(items, 'hosts')), 'hostid');
-
-  let grafanaSeries = _.map(_.compact(time_series), series => {
-    let itemid = series.name;
-    var item = _.find(items, {'itemid': itemid});
-    var alias = item.name;
-
-    // Only when actual multiple hosts selected
-    if (_.keys(hosts).length > 1 && addHostName) {
-      var host = _.find(hosts, {'hostid': item.hostid});
-      alias = host.name + ": " + alias;
-    }
-
-    // CachingProxy deduplicates requests and returns one time series for equal queries.
-    // Clone is needed to prevent changing of series object shared between all targets.
-    let datapoints = _.cloneDeep(series.points);
-    return {
-      target: alias,
-      datapoints: datapoints
-    };
-  });
-
-  return _.sortBy(grafanaSeries, 'target');
-}
-
-function compactSQLQuery(query) {
-  return query.replace(/\s+/g, ' ');
-}
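The `convertGrafanaTSResponse` helper removed here moves into the shared connector code rather than disappearing; its alias rule can be demonstrated in isolation. The sketch below is a hypothetical lodash-free version with made-up item and host data, not the plugin's actual implementation:

```javascript
// Hypothetical standalone sketch of convertGrafanaTSResponse: maps each
// {name: itemid, points} series to Grafana's {target, datapoints}, prefixing
// the host name only when more than one host was selected, then sorts by target.
function convertGrafanaTSResponse(timeSeries, items, hosts, addHostName = true) {
  const multipleHosts = hosts.length > 1;
  return timeSeries
    .map(series => {
      const item = items.find(it => it.itemid === series.name);
      let alias = item.name;
      if (multipleHosts && addHostName) {
        const host = hosts.find(h => h.hostid === item.hostid);
        alias = host.name + ': ' + alias;
      }
      // Clone points: cached series objects may be shared between targets.
      return { target: alias, datapoints: series.points.map(p => p.slice()) };
    })
    .sort((a, b) => a.target.localeCompare(b.target));
}

const items = [
  { itemid: '1', hostid: '10', name: 'CPU load' },
  { itemid: '2', hostid: '11', name: 'CPU load' },
];
const hosts = [
  { hostid: '10', name: 'web01' },
  { hostid: '11', name: 'web02' },
];
const series = [
  { name: '2', points: [[0.4, 1550000000]] },
  { name: '1', points: [[0.1, 1550000000]] },
];
console.log(convertGrafanaTSResponse(series, items, hosts).map(s => s.target));
// [ 'web01: CPU load', 'web02: CPU load' ]
```

With a single host selected the item name is used as-is, which keeps single-host dashboards uncluttered.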

View File

@@ -1,10 +1,12 @@
 import _ from 'lodash';
 import * as utils from '../utils';
 import responseHandler from '../responseHandler';
-import { ZabbixAPIConnector } from './connectors/zabbix_api/zabbixAPIConnector';
-import { SQLConnector } from './connectors/sql/sqlConnector';
 import { CachingProxy } from './proxy/cachingProxy';
 import { ZabbixNotImplemented } from './connectors/dbConnector';
+import { DBConnector } from './connectors/dbConnector';
+import { ZabbixAPIConnector } from './connectors/zabbix_api/zabbixAPIConnector';
+import { SQLConnector } from './connectors/sql/sqlConnector';
+import { InfluxDBConnector } from './connectors/influxdb/influxdbConnector';

 const REQUESTS_TO_PROXYFY = [
   'getHistory', 'getTrend', 'getGroups', 'getHosts', 'getApps', 'getItems', 'getMacros', 'getItemsByIDs',
@@ -23,7 +25,7 @@ const REQUESTS_TO_BIND = [
 ];

 export class Zabbix {
-  constructor(options, backendSrv, datasourceSrv) {
+  constructor(options, datasourceSrv, backendSrv) {
     let {
       url,
       username,
@@ -35,6 +37,7 @@ export class Zabbix {
       enableDirectDBConnection,
       dbConnectionDatasourceId,
       dbConnectionDatasourceName,
+      dbConnectionRetentionPolicy,
     } = options;

     this.enableDirectDBConnection = enableDirectDBConnection;
@@ -48,19 +51,32 @@ export class Zabbix {
     this.zabbixAPI = new ZabbixAPIConnector(url, username, password, zabbixVersion, basicAuth, withCredentials, backendSrv);

-    if (enableDirectDBConnection) {
-      let dbConnectorOptions = {
-        datasourceId: dbConnectionDatasourceId,
-        datasourceName: dbConnectionDatasourceName
-      };
-      this.dbConnector = new SQLConnector(dbConnectorOptions, backendSrv, datasourceSrv);
-      this.getHistoryDB = this.cachingProxy.proxyfyWithCache(this.dbConnector.getHistory, 'getHistory', this.dbConnector);
-      this.getTrendsDB = this.cachingProxy.proxyfyWithCache(this.dbConnector.getTrends, 'getTrends', this.dbConnector);
-    }
-
     this.proxyfyRequests();
     this.cacheRequests();
     this.bindRequests();
+
+    if (enableDirectDBConnection) {
+      const connectorOptions = { dbConnectionRetentionPolicy };
+      this.initDBConnector(dbConnectionDatasourceId, dbConnectionDatasourceName, datasourceSrv, connectorOptions)
+      .then(() => {
+        this.getHistoryDB = this.cachingProxy.proxyfyWithCache(this.dbConnector.getHistory, 'getHistory', this.dbConnector);
+        this.getTrendsDB = this.cachingProxy.proxyfyWithCache(this.dbConnector.getTrends, 'getTrends', this.dbConnector);
+      });
+    }
   }

+  initDBConnector(datasourceId, datasourceName, datasourceSrv, options) {
+    return DBConnector.loadDatasource(datasourceId, datasourceName, datasourceSrv)
+    .then(ds => {
+      let connectorOptions = { datasourceId, datasourceName };
+      if (ds.type === 'influxdb') {
+        connectorOptions.retentionPolicy = options.dbConnectionRetentionPolicy;
+        this.dbConnector = new InfluxDBConnector(connectorOptions, datasourceSrv);
+      } else {
+        this.dbConnector = new SQLConnector(connectorOptions, datasourceSrv);
+      }
+      return this.dbConnector;
+    });
+  }
+
   proxyfyRequests() {
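The branch inside `initDBConnector` reduces to a small dispatch rule: only a data source whose `type` is `influxdb` gets the Influx connector and the retention-policy option; everything else falls through to the SQL path. A hypothetical reduction of that rule (illustrative names, not the plugin's API):

```javascript
// Hypothetical reduction of the connector-dispatch rule in initDBConnector.
function pickConnector(ds, options) {
  if (ds.type === 'influxdb') {
    // Retention policy only makes sense for the Influx path.
    return { kind: 'influx', retentionPolicy: options.dbConnectionRetentionPolicy };
  }
  // mysql, postgres and anything else fall through to the SQL connector
  return { kind: 'sql' };
}

console.log(pickConnector({ type: 'influxdb' }, { dbConnectionRetentionPolicy: 'one_year' }));
// { kind: 'influx', retentionPolicy: 'one_year' }
console.log(pickConnector({ type: 'mysql' }, {}));
// { kind: 'sql' }
```

Because the real dispatch happens after `loadDatasource` resolves, the cached `getHistoryDB`/`getTrendsDB` wrappers are only installed once the promise settles, as shown in the constructor above.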

View File

@@ -54,6 +54,12 @@ jest.mock('grafana/app/core/table_model', () => {
   };
 }, {virtual: true});

+jest.mock('grafana/app/core/config', () => {
+  return {
+    buildInfo: { env: 'development' }
+  };
+}, {virtual: true});
+
 jest.mock('jquery', () => 'module not found', {virtual: true});

 // Required for loading angularjs