Dirty time-oriented data: whereabouts and bottlenecks

An interdisciplinary and interactive session on the handling of imperfect time-stamped information sets
16-17 September 2014


In conjunction with i-KNOW 2014
14th International Conference on Knowledge Technologies and Data-driven Business, Graz, Austria.


Context and motivation
In a number of application fields, data and information need to be modelled and visualised in order to analyse patterns of change over time, or to pinpoint time-related causal chains. In short, understanding facts often means understanding the processes that lead to those facts, or that derive from them; and reasoning about processes requires mastering the time parameter.
However, coping with dirty time-oriented data (inaccurate, incomplete, erroneous, contradictory, etc.) remains, by and large, an open issue. If scientists and practitioners are to foster the emergence of effective solutions, it is essential that they have opportunities to compare their approaches, ideas, experiences and methods.
The workshop’s ambition is to foster interdisciplinary scientific exchange, both on theoretical and technological aspects and on practical cases and feedback stemming from a wide range of application fields.

Graz, Austria (Messe Congress Graz, www.mcg.at);
in conjunction with i-KNOW 2014, Graz University of Technology, Know Center, i-know.tugraz.at

Event format and expected outcomes:
The event will combine a “traditional” workshop format (keynotes focusing on dirty time-oriented data as such, and talks focusing either on theoretical or technological aspects or on feedback from a variety of application fields) with a collective initiative aimed at pooling test cases and resources: interdisciplinary references on the one hand, and test benches freely usable by the community in its subsequent work on the other. Expected outcomes of this workshop accordingly include a better acquaintance with relevant work across scientific domains and application fields, as well as the emergence of ad-hoc metrics intended to ease the cross-examination of responses to data quality issues. Ultimately, the initiative aims to help strengthen the community of analysts dealing with quality issues in the context of temporal data and information sets.