962 results for multiple data sources


Relevance:

90.00%

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect a considerable part of meteorological observations to be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis, in meteorology means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Because numerical weather prediction models must solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or, more often, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with its separate data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years: weather services have based their analyses not only on synoptic data valid at the analysis time and on climatology, but also on fields predicted from the previous observation hour and valid at the analysis time. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could also be justified for the conventional observations: we have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. With a 3-hour step in the analysis-forecasting cycle, instead of the 12 hours most often applied, we could treat all observations as synoptic without difficulty. No observation would then be more than 90 minutes off the analysis time, and even during strongly transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
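A back-of-the-envelope sketch of the cycle-shortening argument above: each observation time is rounded to its nearest 3-hourly analysis time, so no observation is ever more than 90 minutes off. The function and the example timestamp are hypothetical illustrations, not taken from the source.

```python
# Assign asynoptic observations to 3-hour analysis cycles (hypothetical sketch).
from datetime import datetime, timedelta

CYCLE_HOURS = 3  # analysis-forecasting cycle length

def nearest_analysis_time(obs_time: datetime) -> datetime:
    """Round an observation time to the nearest 3-hourly analysis time."""
    day_start = obs_time.replace(hour=0, minute=0, second=0, microsecond=0)
    cycle = CYCLE_HOURS * 3600
    index = round((obs_time - day_start).total_seconds() / cycle)
    return day_start + timedelta(seconds=index * cycle)

obs = datetime(2024, 1, 15, 13, 20)                  # a 13:20 UTC observation
analysis = nearest_analysis_time(obs)                # -> 12:00 UTC
offset_min = abs((obs - analysis).total_seconds()) / 60
assert offset_min <= 90                              # never more than 90 min off time
print(analysis, f"({offset_min:.0f} min off the analysis time)")
```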

Relevance:

90.00%

Abstract:

Smart healthcare is a complex domain for systems integration, owing to the human and technical factors and the heterogeneous data sources involved. As part of a smart city, it is an area in which clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many integration and interoperability challenges, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an ongoing project at a local hospital as our case study. The project aims to achieve data sharing and interoperability among Radiology Information Systems (RIS), Electronic Patient Records (EPR), and Picture Archiving and Communication Systems (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation, including publications and internal working papers, one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists, and secretaries. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarizes the empirical findings and proposes recommendations for guiding integration in the radiology context.

Relevance:

90.00%

Abstract:

Autism Spectrum Disorder (ASD) is diagnosed on the basis of behavioral symptoms, but cognitive abilities may also be useful in characterizing individuals with ASD. One hundred seventy-eight high-functioning male adults, half with ASD and half without, completed tasks assessing IQ, a broad range of cognitive skills, and autistic and comorbid symptomatology. The aims of the study were, first, to determine whether significant differences existed between cases and controls on cognitive tasks, and whether cognitive profiles, derived using a multivariate classification method with data from multiple cognitive tasks, could distinguish between the two groups; second, to establish whether cognitive skill level was correlated with degree of autistic symptom severity; third, whether cognitive skill level was correlated with degree of comorbid psychopathology; and fourth, to compare the cognitive characteristics of individuals with Asperger Syndrome (AS) and high-functioning autism (HFA). After controlling for IQ, the ASD and control groups scored significantly differently on tasks of social cognition, motor performance, and executive function (all P < 0.05). To investigate cognitive profiles, 12 variables were entered into a support vector machine (SVM), which achieved good classification accuracy (81%), significantly better than chance (P < 0.0001). After correcting for multiple comparisons, there were no significant associations between cognitive performance and severity of either autistic or comorbid symptomatology. There were no significant differences between the AS and HFA groups on the cognitive tasks. Cognitive classification models could be a useful aid to the diagnostic process when used in conjunction with other data sources, including clinical history.
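A minimal sketch of the classification step described above: an SVM trained on 12 cognitive variables to separate cases from controls. The data below are synthetic, and the linear kernel, scaling, and 10-fold cross-validation are assumptions, since the abstract does not specify the settings used.

```python
# Synthetic illustration of SVM-based cognitive profiling (not the study's data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, n_vars = 178, 12                      # 178 participants, 12 cognitive variables
X = rng.normal(size=(n, n_vars))         # synthetic task scores
y = np.repeat([0, 1], n // 2)            # 0 = control, 1 = ASD
X[y == 1] += 0.5                         # inject a group difference

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
print(f"cross-validated accuracy: {cross_val_score(clf, X, y, cv=10).mean():.2f}")
```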

Relevance:

90.00%

Abstract:

Recent years have seen enormous advances in sequencing and array-based technologies, producing supplementary or alternative views of the genome that are stored in various formats and databases. Their sheer volume and differing data scope pose a challenge to jointly visualizing and integrating diverse data types. We present AmalgamScope, a new interactive software tool that assists scientists with the annotation of the human genome and, in particular, with integrating annotation files from multiple data types using gene identifiers and genomic coordinates. Supported platforms include next-generation sequencing and microarray technologies. The features of AmalgamScope range from annotating diverse data types across the human genome to integrating the data based on the annotation information and visualizing the merged files within chromosomal regions or across the whole genome. Additionally, users can define custom transcriptome library files for any species and use the tool's options for exchanging files with remote servers.
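The identifier-based integration described above amounts to joining annotation tables on shared keys. A minimal sketch follows; the column names and values are hypothetical, not AmalgamScope's actual file formats.

```python
# Hypothetical merge of sequencing and microarray annotations on gene identifiers.
import pandas as pd

ngs = pd.DataFrame({
    "gene_id": ["BRCA1", "TP53"],
    "chrom": ["chr17", "chr17"],
    "start": [43044295, 7668402],
    "coverage": [120, 95],
})
array = pd.DataFrame({
    "gene_id": ["BRCA1", "TP53"],
    "probe": ["A_23_P111", "A_23_P222"],
    "log2_ratio": [0.8, -1.2],
})

# Integrate the two data types via the shared gene identifier.
merged = ngs.merge(array, on="gene_id", how="inner")
print(merged)
```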

Relevance:

90.00%

Abstract:

Precipitation and temperature climate indices are calculated using the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis and validated against observational data from stations over Brazil and other data sources. The spatial patterns of the climate index trends are analyzed for the period 1961-1990 over South America. In addition, correlation and linear regression coefficients for some specific stations were obtained for comparison with the reanalysis data. In general, the results suggest that the NCEP/NCAR reanalysis can provide useful information about minimum temperature and consecutive dry days indices at individual grid cells in Brazil. However, some regional differences in the climate index trends are observed when different data sets are compared. For instance, the NCEP/NCAR reanalysis shows a reversed signal for all annual rainfall indices and for the cold night index over Argentina. Despite these differences, maps of the trends for most of the annual climate indices obtained from the NCEP/NCAR reanalysis and the BRANT analysis are generally in good agreement with other available data sources and with previous findings in the literature for large areas of southern South America. The pattern of trends for the annual precipitation indices over the 30 years analyzed indicates a change to wetter conditions over southern and southeastern parts of Brazil, Paraguay, Uruguay, central and northern Argentina, and parts of Chile, and a decrease over southwestern South America. All over South America, the climate indices related to minimum temperature (warm or cold nights) clearly show a warming tendency; however, no consistent changes in maximum temperature extremes (warm and cold days) have been observed. Therefore, one must be careful before suggesting any trend for warm or cold days.
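As a sketch of the trend estimation behind such maps, the following fits an ordinary least-squares slope to a synthetic annual index at a single grid cell over 1961-1990; the index values are fabricated for illustration only.

```python
# Linear trend of a synthetic annual climate index (illustration only).
import numpy as np
from scipy import stats

years = np.arange(1961, 1991)
rng = np.random.default_rng(1)
warm_nights = 10 + 0.15 * (years - 1961) + rng.normal(0, 1, years.size)

res = stats.linregress(years, warm_nights)
print(f"trend: {res.slope:.3f} index units/year (p = {res.pvalue:.3f})")
```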

Relevance:

90.00%

Abstract:

The aim of this study is to evaluate the variation in solar radiation data between different data sources that are free and available to the Solar Energy Research Center (SERC). The comparison between data sources is carried out for two locations: Stockholm, Sweden and Athens, Greece. For each location, data are gathered for different south-facing tilt angles: 0°, 30°, 45°, and 60°. The full dataset is available in two Excel files: “Stockholm annual irradiation” and “Athens annual irradiation”. The World Radiation Data Center (WRDC) is taken as the reference for the comparison with the other datasets, because it has the longest recorded time span for Stockholm (1964–2010) and Athens (1964–1986), in the form of average monthly irradiation, expressed in kWh/m². The indicator defined for the data comparison is the estimated standard deviation. The mean biased error (MBE) and the root mean square error (RMSE) were also used as statistical indicators for the horizontal solar irradiation data. The variation in solar irradiation data falls into three categories: natural or inter-annual variability, variation due to different data sources, and variation due to different calculation models. The inter-annual variation is 140.4 kWh/m² (14.4%) for Stockholm and 124.3 kWh/m² (8.0%) for Athens. The estimated deviation for horizontal solar irradiation is 3.7% for Stockholm and 4.4% for Athens. This estimated deviation is 4.5% and 3.6% for Stockholm and Athens respectively at 30° tilt, 5.2% and 4.5% at 45° tilt, and 5.9% and 7.0% at 60° tilt. NASA's SSE, SAM, and RETScreen exhibited the highest deviation from the WRDC data for Stockholm, and Satel-light for Athens. The essential source of variation is the difference in horizontal solar irradiation. The variation increases by 1–2% per degree of tilt when different calculation models are used, as in PVSYST and Meteonorm. The location and altitude of the data source did not directly influence the deviation from the WRDC data. Further examination is suggested in order to: improve the methodology for selecting locations; examine the functional dependence of ground-reflected radiation on ambient temperature; study the variation of ambient temperature and its impact on different solar energy systems; and assess the impact of variation in solar irradiation and ambient temperature on system output.
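The two statistical indicators named above have standard definitions: MBE = (1/n) Σ(source − reference) and RMSE = √((1/n) Σ(source − reference)²). The sketch below computes both for synthetic monthly series, taking WRDC as the reference.

```python
# MBE and RMSE for a hypothetical data source against a WRDC-style reference.
import numpy as np

wrdc = np.array([18, 35, 72, 110, 155, 165, 160, 125, 80, 42, 20, 12], float)
other = wrdc * 1.05                     # hypothetical alternative source (kWh/m2)

mbe = np.mean(other - wrdc)             # mean biased error
rmse = np.sqrt(np.mean((other - wrdc) ** 2))
print(f"MBE = {mbe:.1f} kWh/m2, RMSE = {rmse:.1f} kWh/m2")
```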

Relevance:

90.00%

Abstract:

Data collecting is necessary for organizations that have very small databases, such as nuclear power plants and earthquake bureaus. Traditional data collecting obtains the necessary data from internal and external data sources and joins it all together to create a huge homogeneous database. Because collected data may be untrustworthy, it can disguise genuinely useful patterns in the data. In this paper, breaking away from the traditional data-collecting mode that treats internal and external data equally, we argue that the first step in utilizing external data is to identify quality data in the data sources for a given mining task. Pre- and post-analysis techniques are thus advocated for generating quality data.
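A minimal sketch of what such a pre-analysis step might look like: each external source is screened against trusted internal data before merging. The agreement-based quality score and threshold are illustrative assumptions, not the paper's actual techniques.

```python
# Hypothetical pre-analysis: keep only external sources that agree with internal data.
def agreement_score(internal: dict, external: dict) -> float:
    """Fraction of overlapping keys on which the two sources agree."""
    shared = internal.keys() & external.keys()
    return sum(internal[k] == external[k] for k in shared) / len(shared) if shared else 0.0

internal = {"site1": 4.2, "site2": 3.9, "site3": 5.1}
external_sources = {
    "bureau_A": {"site1": 4.2, "site2": 3.9, "site4": 6.0},
    "bureau_B": {"site1": 9.9, "site2": 0.1, "site5": 2.2},
}

THRESHOLD = 0.8
quality = {name: src for name, src in external_sources.items()
           if agreement_score(internal, src) >= THRESHOLD}
print(list(quality))   # only bureau_A passes the screen
```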

Relevance:

90.00%

Abstract:

Collecting, analyzing, and making molecular-biological annotation data accessible in public data sources is an ongoing effort, and integrating data from these sources can yield valuable biological knowledge. Annotation data are plentiful, but only some of them are structured, and the number and contents of the related sources are continuously increasing. In addition, each existing data source has its own storage structure and implementation, which limits how annotations can be combined. Here, we propose a tool, called ANNODA, for integrating molecular-biological annotation data. Unlike past work on database interoperation in the bioinformatics community, this database design uses web links, which are very useful for interactive navigation, while also supporting automated large-scale analysis tasks.
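A minimal sketch of the web-link style of integration attributed to ANNODA: records from different sources are tied together by generating navigable links from shared identifiers. The record layout and identifiers are hypothetical; the URL patterns are those of the public UniProt and PubMed services.

```python
# Hypothetical annotation records linked together via shared identifiers.
records = [
    {"gene": "TP53", "uniprot": "P04637", "pubmed": "26845302"},
    {"gene": "BRCA1", "uniprot": "P38398", "pubmed": "20301425"},
]

def annotate_links(rec: dict) -> dict:
    """Attach navigable web links derived from the record's identifiers."""
    rec = dict(rec)
    rec["links"] = {
        "uniprot": f"https://www.uniprot.org/uniprotkb/{rec['uniprot']}",
        "pubmed": f"https://pubmed.ncbi.nlm.nih.gov/{rec['pubmed']}/",
    }
    return rec

for rec in map(annotate_links, records):
    print(rec["gene"], rec["links"]["uniprot"])
```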

Relevance:

90.00%

Abstract:

This study investigated the various sources of information used by business organisations and their association with business performance indicators. The results showed that larger Australian companies predominantly used various internal and external secondary data sources of information, followed by formal primary marketing research. The study indicated an association between market share and the use of competitors, advertising agencies, and sales promotion agencies as sources of information, while overall financial performance was associated with the use of competitors and media/trade publications.

Relevance:

90.00%

Abstract:

The peer-to-peer content distribution network (PCDN) has recently attracted considerable attention, and it has huge potential for massive data-intensive applications on the Internet. One of the challenges in PCDNs is routing for data sources and data delivery. In this paper, we studied a network model formed by dynamic autonomy areas, structured source servers, and proxy servers. Based on this network model, we proposed a number of algorithms to address the routing and data delivery issues. To cope with the highly dynamic nature of the autonomy areas, we established dynamic tree-structure proliferation system routing, proxy routing, and resource-searching algorithms. Simulation results showed that the performance of the proposed network model and algorithms is stable.
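The abstract does not detail the proposed algorithms, so the following is only a generic illustration of resource searching in a tree of proxy nodes rooted at a source server; the node and resource names are assumed.

```python
# Generic depth-first resource search in a proxy tree (illustration only).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    resources: set = field(default_factory=set)
    children: list = field(default_factory=list)

def search(node: Node, resource: str):
    """Return the name of the first node holding the resource, else None."""
    if resource in node.resources:
        return node.name
    for child in node.children:
        found = search(child, resource)
        if found is not None:
            return found
    return None

root = Node("source-server", children=[
    Node("proxy-1", resources={"video-42"}),
    Node("proxy-2", children=[Node("proxy-2a", resources={"video-7"})]),
])
print(search(root, "video-7"))   # -> proxy-2a
```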

Relevance:

90.00%

Abstract:

Objective: To compare the quality and funding source of studies concluding a negative economic impact of smoke-free policies in the hospitality industry to studies concluding no such negative impact.

Data sources: Researchers sought all studies produced before 31 August 2002. Articles published in scientific journals were located with Medline, Science Citation Index, Social Sciences Citation Index, Current Contents, PsychInfo, Econlit, and Healthstar. Unpublished studies were located from tobacco company websites and through internet searches.

Study selection: 97 studies that made statements about economic impact were included; 93% of the studies located met the selection criteria, as determined by consensus among multiple reviewers.

Data extraction: Findings and characteristics of studies (apart from funding source) were classified independently by two researchers. A third assessor, blind both to the objective of the present study and to the funding source, also classified each study.

Data synthesis: In studies concluding a negative impact, the odds of using a subjective outcome measure were 4.0 times (95% confidence interval (CI) 1.4 to 9.6; p = 0.007), and the odds of not being peer reviewed were 20 times (95% CI 2.6 to 166.7; p = 0.004), those of studies concluding no such negative impact (see the computational sketch after the conclusion). All of the studies concluding a negative impact were supported by the tobacco industry. 94% of the tobacco industry-supported studies concluded a negative economic impact, compared with none of the non-industry-supported studies.

Conclusion: All of the best designed studies report no impact or a positive impact of smoke-free restaurant and bar laws on sales or employment. Policymakers can act to protect workers and patrons from the toxins in secondhand smoke, confident in rejecting industry claims that there will be an adverse economic impact.
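For readers unfamiliar with the statistic reported in the data synthesis, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a 2×2 table. The counts are hypothetical, chosen only to yield an OR of 4.0; they are not the study's data.

```python
# Odds ratio and 95% CI from a hypothetical 2x2 table (not the study's counts).
import math

# rows: studies concluding a negative impact / concluding no negative impact
# cols: subjective outcome measure used / not used
a, b = 20, 10
c, d = 10, 20

odds_ratio = (a * d) / (b * c)                     # = 4.0
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)       # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.1f}, 95% CI {lo:.1f} to {hi:.1f}")
```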

Relevance:

90.00%

Abstract:

Oral diseases, including dental caries and periodontal disease, are among the most prevalent and costly diseases in Australia today. Around 5.4% of Australia’s health dollar is spent on dental services, totalling around $2.6 billion, 84% of which is delivered through the private sector (AIHW 2001). The other 16% is spent providing public sector services in varied and inadequate ways. While disease rates among school children have declined significantly in the past 20 years, the gains made among children are not flowing on to adult dentitions, and our aging population will place increasing demands on an inadequate system into the future (AHMAC 2001). Around 50% of adults do not receive regular care, and this has implications for widening health inequalities, as the greatest burden falls on lower income groups (AIHW DSRU 2001). The National Competition Policy agenda has initiated, Australia-wide, reviews of dental legislation applying to the delivery of services by dentists, dental specialists, dental therapists and hygienists, and dental technicians and prosthetists. The review of the Victorian Dentists Act 1972 was completed first, in 1999, followed by those of the other Australian states, with Queensland, the ACT and the Northern Territory still developing legislation. One of the objectives of the new Victorian Act is to ‘…promote access to dental care’. This study has grown out of the need to know more about how dental therapists and hygienists might be utilised to achieve this, and about the legislative frameworks that could enable such roles. This study used qualitative methods to explore dental health policy making associated with strategies that may increase access to dental care using dental therapists and hygienists. The study used a multiple case study design to critically examine the dental policy development process around the Review of the Dentists Act 1972 in Victoria; to assess legislative and regulatory dental policy reforms in other Australian states; and to conduct a comparative analysis of dental health policy as it relates to dental auxiliary practice internationally. Data collection involved (i) semi-structured interviews with key participants and stakeholders in the policy development processes in Victoria, interstate, and overseas, and (ii) analysis of documentary data sources. The study took a grounded theory approach whereby theoretical issues that emerged from the Victorian case study were further developed and challenged in the subsequent interstate and international case studies. A component of this study required the development of indicators, within regulatory models for dental hygienists and therapists, of increased access to dental care for the community. These indicators were used to analyse regulation reform and its likely impacts in each setting. Despite evidence of need, evidence of the effectiveness and efficiency of dental therapists and hygienists, and the National Competition Policy agenda of increasing efficiency, the legislation reviews have mostly produced only minor changes. The results show that almost all Australian states regulate dental therapists and hygienists in more prescriptive ways than they do dentists. The study found that dental policy making is still dominated by the views of private practice dentists, under elitist models that largely protect dentist authority, autonomy, and sovereignty. The influence of dentist professional dominance has meant that governments have been reluctant to make sweeping changes. The study has demonstrated alternative models of regulation for dental therapists and hygienists which would allow wider utilisation of their skills, more effective use of public sector funding, increased access to services, and a greater focus on preventive care. In the light of these outcomes, there is a need to continue to advocate for changes that will increase the public health focus of oral health care.

Relevance:

90.00%

Abstract:

The impacts on the environment from human activities are of increasing concern. The need to reduce energy consumption is of particular interest, especially in the construction and operation of buildings, which accounts for between 30 and 40% of Australia's national energy consumption. Much past and more recent emphasis has been placed on methods for reducing the energy consumed in the operation of buildings. With the energy embodied in these buildings having been shown to account for an equally large proportion of a building's life cycle energy consumption, there is a need to look at ways of reducing the embodied energy of buildings and related products. Life cycle assessment (LCA) is considered the most appropriate tool for assessing the life cycle energy consumption of buildings and their products. The life cycle inventory analysis (LCIA) step of an LCA, in which an inventory of material and energy inputs is gathered, currently suffers from several limitations, mainly concerned with the use of incomplete and unreliable data sources and LCIA methods. Traditional LCIA methods include process-based and input-output-based LCIA. Process-based LCIA uses process-specific data, whilst input-output-based LCIA uses data produced from an analysis of the flow of goods and services between sectors of the Australian economy, known as input-output data. With the incompleteness and unreliability of these two methods in mind, hybrid LCIA methods combining process and input-output data have been developed to minimise the errors associated with traditional LCIA methods. Hybrid LCIA methods based on process data have been shown to be incomplete. Hybrid LCIA methods based on input-output data substitute available process data into the input-output model, minimising the errors associated with process-based hybrid LCIA methods. However, until now, this LCIA method had not been tested for its level of completeness and reliability. The aim of this study was to assess the reliability and completeness of hybrid life cycle inventory analysis as applied to the Australian construction industry. A range of case studies was selected in order to apply the input-output-based hybrid LCIA method and evaluate the results obtained from each case study. These case studies included buildings: two commercial office buildings, two residential buildings, and a recreational building; and building-related products: a solar hot water system, a building-integrated photovoltaic system, and a washing machine. The range of building types and products selected assisted in testing the input-output-based hybrid LCIA method for its applicability across a wide range of product types. The input-output-based hybrid LCIA method was applied to each of the selected case studies to obtain their respective embodied energy results, which were then evaluated using a number of methods. These included an analysis of the difference between the process-based and input-output-based hybrid LCIA results, as an evaluation of the completeness of the process-based LCIA method, and a comparison between equivalent process and input-output values used in the input-output-based hybrid LCIA method, as a measure of reliability. It was found that the results from a typical process-based LCIA and a process-based hybrid LCIA show a large gap when compared to input-output-based hybrid LCIA results (up to 80%). This gap shows that the quantity of process data currently available in Australia is insufficient. The comparison between equivalent process-based and input-output-based LCIA values showed that the input-output data do not provide a reliable representation of the equivalent process values, for material energy intensities, material inputs, or whole products. Therefore, the use of input-output data to account for inadequate or missing process data is not reliable. However, as there is currently no other method for filling the gaps in traditional process-based LCIA, and as input-output data are considered more complete than process data and the errors may be somewhat lower, using input-output data to fill the gaps in traditional process-based LCIA appears better than not using any data at all. The input-output-based hybrid LCIA method evaluated in this study has been shown to be the most sophisticated and complete LCIA method currently available for assessing the environmental impacts associated with buildings and building-related products. This finding is significant, as the construction and operation of buildings accounts for a large proportion of national energy consumption. The use of the input-output-based hybrid LCIA method for products other than those related to the Australian construction industry may also be appropriate, especially if the material inputs of the product being assessed are similar to those typically used in the construction industry. The input-output-based hybrid LCIA method corrects some of the errors and limitations associated with previous LCIA methods, without introducing any new errors. Improvements in current input-output models are also needed, particularly to account for the inclusion of capital equipment inputs (i.e. the energy required to manufacture the machinery and other equipment used in the production of building materials, products, etc.). Although further improvements in the quantity of currently available process data are also needed, this study has shown that, with the embodied energy data currently available for LCIA, the input-output-based hybrid LCIA method appears to provide the most reliable and complete basis for assessing the environmental impacts of the Australian construction industry.
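A highly simplified sketch of the substitution idea behind the input-output-based hybrid LCIA method discussed above: process values replace input-output values wherever process data exist, and input-output values fill the remaining gaps. All figures and input names are hypothetical.

```python
# Hypothetical hybrid embodied energy calculation (GJ) for one product.
io_inputs = {            # input-output estimates per input pathway
    "steel": 120.0,
    "concrete": 80.0,
    "glass": 30.0,
    "services": 25.0,    # pathway with no process data equivalent
}
process_data = {         # reliable process-based values, where available
    "steel": 95.0,
    "concrete": 70.0,
}

# Substitute process values into the input-output model; keep input-output
# values to fill the gaps in the process data.
hybrid = {k: process_data.get(k, v) for k, v in io_inputs.items()}
print(f"hybrid embodied energy: {sum(hybrid.values()):.0f} GJ")   # 220 GJ
```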

Relevance:

90.00%

Abstract:

This research obtains optimal estimation and data fusion for linear and nonlinear systems subject to uncertain observations (missing measurements). The noises from the different data sources are assumed to be correlated. The derivation of a robust Kalman filter for systems subject to additional uncertainties in the modelling parameters is presented.
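A minimal sketch (not the paper's derivation) of filtering under uncertain observations: a scalar Kalman filter that skips the measurement update whenever the observation is missing. All model parameters are illustrative assumptions.

```python
# Scalar Kalman filter with randomly missing measurements (illustration only).
import numpy as np

rng = np.random.default_rng(0)
a, q, h, r = 0.95, 0.1, 1.0, 0.5     # state gain, process noise, obs gain, obs noise
p_obs = 0.7                          # probability that a measurement arrives

x_true, x_est, P = 0.0, 0.0, 1.0
for _ in range(50):
    x_true = a * x_true + rng.normal(0, np.sqrt(q))                 # simulate state
    z = h * x_true + rng.normal(0, np.sqrt(r)) if rng.random() < p_obs else None
    x_est, P = a * x_est, a * P * a + q                             # predict
    if z is not None:                                               # update only if observed
        K = P * h / (h * P * h + r)
        x_est += K * (z - h * x_est)
        P = (1 - K * h) * P
print(f"final estimate {x_est:.2f} vs. true state {x_true:.2f}")
```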

Relevance:

90.00%

Abstract:

Recently, Aissa-El-Bey et al. have proposed two subspace-based methods for underdetermined blind source separation (UBSS) in the time-frequency (TF) domain. These methods allow multiple active sources at TF points, so long as the number of active sources at any TF point is strictly less than the number of sensors and the column vectors of the mixing matrix are pairwise linearly independent. In this correspondence, we first show that the subspace-based methods must also satisfy the condition that any M × M submatrix of the mixing matrix is of full rank. We then present a new UBSS approach which only requires that the number of active sources at any TF point does not exceed the number of sensors. An algorithm is proposed to perform the UBSS.
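The full-rank condition introduced above can be checked directly by enumerating all column subsets of size M; below is a small sketch for an assumed 2 × 4 mixing matrix (M = 2 sensors, N = 4 sources).

```python
# Check that every M x M submatrix of the mixing matrix A has full rank.
from itertools import combinations
import numpy as np

def all_MxM_submatrices_full_rank(A: np.ndarray) -> bool:
    M, N = A.shape
    return all(np.linalg.matrix_rank(A[:, list(cols)]) == M
               for cols in combinations(range(N), M))

A = np.array([[1.0, 0.0, 1.0, 2.0],
              [0.0, 1.0, 1.0, 1.0]])         # assumed mixing matrix
print(all_MxM_submatrices_full_rank(A))      # True: every column pair is independent
```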