966 results for "large classes"


Relevance: 20.00%

Abstract:

Background: Previous studies have found that high temperatures increase the risk of mortality in summer. However, little is known about whether a sharp decrease or increase in temperature between neighbouring days has any effect on mortality. Method: Poisson regression models were used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000. Temperature change was calculated as the current day's mean temperature minus the previous day's mean. Results: In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.157 (95% confidence interval (CI): 1.024, 1.307) for total non-external mortality (NEM), 1.186 (95% CI: 1.002, 1.405) for NEM in females, and 1.442 (95% CI: 1.099, 1.892) for people aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.353 (95% CI: 1.033, 1.772) for cardiovascular mortality and 1.667 (95% CI: 1.146, 2.425) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with mortality, with RRs of 1.133 (95% CI: 1.053, 1.219) for total NEM, 1.252 (95% CI: 1.131, 1.386) for cardiovascular mortality, and 1.254 (95% CI: 1.135, 1.385) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. Conclusion: A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for the current temperature.
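
To make the core model concrete, the following is a minimal sketch (not the authors' exact specification) of a Poisson regression of daily death counts on between-day temperature change, adjusting for the current day's mean temperature. The column names and simulated data are hypothetical, and the published models would also control for trends and other confounders.

```python
# Minimal sketch of the study's core analysis on hypothetical data:
# Poisson regression of daily deaths on between-day temperature change,
# adjusting for the current day's mean temperature.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 3000
df = pd.DataFrame({
    "mean_temp": 20 + 5 * rng.standard_normal(n_days),  # daily mean (deg C)
    "deaths": rng.poisson(30, n_days),                   # daily NEM counts
})

# Temperature change = current day's mean minus the previous day's mean.
df["temp_change"] = df["mean_temp"].diff()
df = df.dropna()

# Indicators for a drop or rise of more than 3 deg C between days.
df["drop_gt3"] = (df["temp_change"] < -3).astype(float)
df["rise_gt3"] = (df["temp_change"] > 3).astype(float)

X = sm.add_constant(df[["drop_gt3", "rise_gt3", "mean_temp"]])
model = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()

# exp(coefficient) approximates the relative risk for each indicator
# (on this synthetic data the RRs will be near 1 by construction).
print(np.exp(model.params[["drop_gt3", "rise_gt3"]]))
```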

Relevance: 20.00%

Abstract:

In the multi-view approach to semi-supervised learning, we choose one predictor from each of multiple hypothesis classes, and we co-regularize our choices by penalizing disagreement among the predictors on the unlabeled data. We examine the co-regularization method used in the co-regularized least squares (CoRLS) algorithm, in which the views are reproducing kernel Hilbert spaces (RKHSs) and the disagreement penalty is the average squared difference in predictions. The final predictor is the pointwise average of the predictors from each view. We call the set of predictors that can result from this procedure the co-regularized hypothesis class. Our main result is a tight bound on the Rademacher complexity of the co-regularized hypothesis class in terms of the kernel matrices of each RKHS. We find that co-regularization reduces the Rademacher complexity by an amount that depends on the distance between the two views, as measured by a data-dependent metric. We then use standard techniques to bound the gap between training error and test error for the CoRLS algorithm. Experimentally, we find that the reduction in complexity introduced by co-regularization correlates with the improvement that co-regularization gives in the CoRLS algorithm.
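
A minimal numpy sketch of the CoRLS setup described above, under illustrative assumptions: the two views are RBF kernels of different widths over the same inputs, the disagreement penalty is the squared difference of predictions on the unlabeled points, and the final predictor averages the two views. The block linear system encodes sufficient stationarity conditions of that objective via the representer theorem; the kernel choices and hyperparameters are placeholders, not the paper's experimental settings.

```python
# Sketch of co-regularized least squares (CoRLS) with two RBF-kernel views.
import numpy as np

def rbf_kernel(A, B, gamma):
    """RBF kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def corls_fit(X, y_labeled, n_labeled, gammas=(0.5, 2.0),
              reg=(1e-2, 1e-2), lam=1.0):
    n = X.shape[0]
    S_L = np.zeros((n, n))                      # selects labeled points
    S_L[:n_labeled, :n_labeled] = np.eye(n_labeled)
    S_U = np.eye(n) - S_L                       # selects unlabeled points
    K1 = rbf_kernel(X, X, gammas[0])            # view 1 kernel matrix
    K2 = rbf_kernel(X, X, gammas[1])            # view 2 kernel matrix
    y_pad = np.zeros(n)
    y_pad[:n_labeled] = y_labeled

    # Stationarity conditions of the CoRLS objective (the gradient is
    # K_v times each block-row residual, so zeroing it zeroes the gradient):
    # squared loss on labeled points + norm regularization per view
    # + lam * squared disagreement between views on unlabeled points.
    A = np.block([
        [S_L @ K1 + lam * S_U @ K1 + reg[0] * np.eye(n), -lam * S_U @ K2],
        [-lam * S_U @ K1, S_L @ K2 + lam * S_U @ K2 + reg[1] * np.eye(n)],
    ])
    b = np.concatenate([y_pad, y_pad])
    alpha = np.linalg.solve(A, b)
    return alpha[:n], alpha[n:], K1, K2

# 10 labeled and 30 unlabeled points; final predictor averages the views.
X = np.random.default_rng(1).standard_normal((40, 2))
y = np.sign(X[:10, 0])
a1, a2, K1, K2 = corls_fit(X, y, n_labeled=10)
f = 0.5 * (K1 @ a1 + K2 @ a2)   # pointwise average of the two predictors
```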

Relevance: 20.00%

Abstract:

The effective implementation of an ISO 9001 Quality Management System (QMS) in construction companies requires proper and full implementation of the system, allowing companies to improve the way they operate, thereby increasing profitability and market share, producing innovative and sustainable construction products, and improving employee and customer satisfaction. In light of this, this paper discusses the current status of QMS implementation, particularly the twenty elements of ISO 9001, within the grade 7 (G-7) category of Indonesian construction companies. A survey involving 403 respondents from 77 companies was conducted to evaluate the current implementation levels of the ISO 9001 elements. The findings indicated that a large percentage of the companies surveyed had 'not so fully implemented' the elements. Scrutiny of the data also identified elements that were 'minimally implemented', while none of the elements fell into the category of 'fully implemented'. Based on these findings, it is suggested that the G-7 contractors may need to fully commit to practicing control of customer-supplied product and statistical techniques, recognized as the two least implemented quality elements, in order to ensure effective implementation of the ISO 9001 elements and better quality performance.

Relevance: 20.00%

Abstract:

While a number of factors have been highlighted in the innovation adoption literature, little is known about whether different factors are related to innovation adoption in firms of different sizes. We used preliminary case studies of small, medium and large firms to ground our hypotheses, which were then tested using a survey of 94 firms. We found that external stakeholder pressure and non-financial readiness were related to innovation adoption in SMEs, but that for large firms, adoption was related to the opportunity to innovate. It may be that the difficulties of adopting innovations, including both the financial cost and the effort involved, are too great for SMEs to overcome unless there is either a compelling need (external pressure) or sufficient in-house capability (non-financial readiness). This suggests that SMEs are more likely to have innovation "pushed" onto them, while large firms are more likely to "pull" innovations when they have the opportunity.

Relevance: 20.00%

Abstract:

Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial-of-service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to rely predominantly on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and for feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context-sensitive features are required to detect current attacks.
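
As one concrete example of the time-based header statistics mentioned above, the sketch below counts, over a sliding time window, how many distinct destination ports each source IP has contacted; a burst of distinct ports from a single source is a classic port-scan indicator. The packet records and the threshold are hypothetical, not drawn from any specific NIDS covered by the review.

```python
# Sliding-window header statistic: distinct destination ports per source IP.
from collections import defaultdict, deque

WINDOW = 5.0      # seconds of history to keep per source
THRESHOLD = 20    # distinct destination ports that triggers a flag

recent = defaultdict(deque)   # src_ip -> deque of (timestamp, dst_port)

def observe(timestamp, src_ip, dst_port):
    """Update the per-source window and flag a possible port scan."""
    q = recent[src_ip]
    q.append((timestamp, dst_port))
    while q and timestamp - q[0][0] > WINDOW:   # expire old entries
        q.popleft()
    distinct_ports = len({port for _, port in q})
    return distinct_ports > THRESHOLD

# Example: one source sweeping ports 1..30 within a second.
for i in range(30):
    flagged = observe(0.03 * i, "10.0.0.5", 1 + i)
print(flagged)   # True once the distinct-port count exceeds the threshold
```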

Relevance: 20.00%

Abstract:

Acoustic sensors play an important role in augmenting the traditional biodiversity monitoring activities carried out by ecologists and conservation biologists. With this ability, however, comes the burden of analysing large volumes of complex acoustic data. Given this complexity, fully automated analysis for a wide range of species remains a significant challenge. This research investigates the use of citizen scientists to analyse large volumes of environmental acoustic data in order to identify bird species. Specifically, it investigates ways in which a user's efficiency can be improved through species identification tools, and the use of reputation models to predict the accuracy of users with unknown skill levels. Initial experimental results are reported.
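
The abstract does not specify the reputation model, so the following is only a sketch of one common approach: a Beta-Bernoulli estimate of a user's tagging accuracy, updated whenever one of their species tags can be checked against an expert-verified label. The prior and the example outcomes are illustrative assumptions, not the paper's model.

```python
# One common reputation approach (illustrative, not necessarily the
# paper's): Beta-Bernoulli tracking of a user's tagging accuracy.
class UserReputation:
    def __init__(self, prior_correct=1.0, prior_wrong=1.0):
        # Beta(a, b) prior over the user's probability of a correct tag.
        self.a = prior_correct
        self.b = prior_wrong

    def update(self, was_correct):
        """Incorporate one tag that was checked against an expert label."""
        if was_correct:
            self.a += 1
        else:
            self.b += 1

    @property
    def expected_accuracy(self):
        # Posterior mean of the Beta distribution.
        return self.a / (self.a + self.b)

rep = UserReputation()
for outcome in [True, True, False, True]:   # checks against expert labels
    rep.update(outcome)
print(round(rep.expected_accuracy, 3))      # ~0.667 under a uniform prior
```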

Relevance: 20.00%

Abstract:

Research attention to projects that involve outside partners has increased rapidly in recent years. Our knowledge of such inter-organizational projects, however, is limited. This paper reports large-scale data from a repeated trend survey of 2000 SMEs, conducted in 2006 and 2009, that focused on inter-organizational project ventures. Our major findings indicate that the overall prevalence of inter-organizational project ventures remained significant and stable over time, despite the economic crisis. Moreover, we find that these ventures predominantly solve repetitive rather than unique tasks and are embedded in prior relations between the partnering organizations. These findings provide empirical support for recent claims that project management should pay more attention to inter-organizational forms of project organization, and suggest that the archetypical view of projects as being unique in every respect should be reconsidered. Both have important implications for project management, especially in the area of project-based learning.

Relevance: 20.00%

Abstract:

Experimental and theoretical studies have shown the importance of stochastic processes in genetic regulatory networks and cellular processes. Cellular networks and genetic circuits often involve small numbers of key proteins, such as transcription factors and signaling proteins. In recent years stochastic models have been used successfully for studying noise in biological pathways, and stochastic modelling of biological systems has become a very important research field in computational biology. One of the challenging problems in this field is reducing the huge computing time of stochastic simulations. Based on the mitogen-activated protein kinase (MAPK) cascade activated by epidermal growth factor, this work gives a parallel implementation using OpenMP, with parallelism across the simulations. Special attention is paid to the independence of the random numbers generated in parallel, which is a key criterion for the success of stochastic simulations. Numerical results indicate that parallel computers can be used as an efficient tool for simulating the dynamics of large-scale genetic regulatory networks and cellular processes.
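
The paper's implementation uses OpenMP; the Python sketch below illustrates the same two ideas in a simpler setting: parallelism across replicate stochastic simulations, and statistically independent random streams per worker (via numpy's SeedSequence.spawn), which is the independence requirement the abstract highlights. The birth-death model is a stand-in for the MAPK cascade, not the authors' system.

```python
# Parallelism across replicate Gillespie (SSA) simulations, with
# independent random streams spawned from one parent SeedSequence.
import numpy as np
from multiprocessing import Pool

def gillespie_birth_death(seed_seq, k_birth=10.0, k_death=0.1, t_end=50.0):
    """One SSA trajectory of a birth-death process; returns final count."""
    rng = np.random.default_rng(seed_seq)
    t, x = 0.0, 0
    while t < t_end:
        rates = np.array([k_birth, k_death * x])
        total = rates.sum()
        t += rng.exponential(1.0 / total)       # time to next reaction
        if rng.random() < rates[0] / total:
            x += 1                              # birth
        else:
            x -= 1                              # death
    return x

if __name__ == "__main__":
    # Spawn statistically independent child streams, one per replicate,
    # so parallel workers never share or overlap random number sequences.
    children = np.random.SeedSequence(42).spawn(1000)
    with Pool() as pool:
        finals = pool.map(gillespie_birth_death, children)
    print(np.mean(finals))   # ~k_birth / k_death = 100 at stationarity
```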

Relevance: 20.00%

Abstract:

Anecdotal evidence suggests that the lifecycle-wide management of Enterprise System (ES) related knowledge is critical for ES health and longevity. At a time when many ES vendors are offering solutions to small and medium-sized organizations, this paper investigates the ability of such organizations to maintain a lifecycle-wide knowledge management strategy. The paper explores the alleged differences in knowledge management practices across 27 small, medium and large organizations that had implemented a market-leading ES. Results suggest that (1) despite similar knowledge-creation efforts across all three organizational sizes, small organizations struggle to retain, transfer and apply the knowledge, and (2) the overall quality of the knowledge management process in large organizations remains higher than in their small and medium counterparts.

Relevance: 20.00%

Abstract:

In this paper we pursue the task of aligning an ensemble of images in an unsupervised manner, a task commonly referred to as "congealing" in the literature. A form of congealing using a least-squares criterion has recently been demonstrated to have desirable properties over conventional congealing. Least-squares congealing can be viewed as an extension of the Lucas & Kanade (LK) image alignment algorithm. It is well understood that the alignment performance of the LK algorithm, when aligning a single image with another, is theoretically and empirically equivalent for additive and compositional warps. In this paper we: (i) demonstrate that this equivalence does not hold for the extended case of congealing, (ii) characterize the inherent drawbacks of least-squares congealing when dealing with large numbers of images, and (iii) propose a novel method for circumventing these limitations through an inverse-compositional strategy that maintains the attractive properties of the original method while being able to handle very large numbers of images.
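
For intuition, here is a minimal, translation-only sketch of least-squares congealing: each image is repeatedly warped toward the evolving ensemble mean using a Gauss-Newton step on the sum-of-squared-differences error. For pure translations the additive and compositional parameterizations coincide, so this toy version conveys only the "align everything to the average" structure, not the paper's inverse-compositional contribution.

```python
# Translation-only least-squares congealing via Gauss-Newton updates.
import numpy as np
from scipy.ndimage import shift as nd_shift

def congeal_translations(images, n_iters=20):
    """Estimate per-image (row, col) shifts aligning a stack of images."""
    n = len(images)
    t = np.zeros((n, 2))                         # per-image translations
    for _ in range(n_iters):
        warped = np.stack([nd_shift(im, t[i]) for i, im in enumerate(images)])
        mean = warped.mean(axis=0)               # current ensemble average
        for i in range(n):
            # Residual between this image and the mean, and the image
            # gradients that serve as the Jacobian of a translation warp.
            err = (warped[i] - mean).ravel()
            gy, gx = np.gradient(warped[i])
            J = np.stack([gy.ravel(), gx.ravel()], axis=1)
            # Gauss-Newton least-squares update for this image's shift.
            dt, *_ = np.linalg.lstsq(J, err, rcond=None)
            t[i] += dt
    # Note: real congealing constrains the average warp to prevent the
    # whole ensemble from drifting together; omitted here for brevity.
    return t
```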

Relevance: 20.00%

Abstract:

Mismanagement of large-scale, complex projects has resulted in spectacular failures, cost overruns, time blowouts, and stakeholder dissatisfaction. We focus our discussion on the interaction of key management and leadership attributes that facilitate leaders' adaptive behaviors. These behaviors should in turn influence adaptive team member behavior, stakeholder engagement, and successful project outcomes, outputs and impacts. An understanding of this type of management benefits from a perspective grounded in managerial and organizational cognition. The research question we explore is whether successful leaders of large-scale complex projects have an internal process leading to a display of administrative, adaptive, and enabling behaviors that foster adaptive processes and enabling behaviors within their teams and with external stakeholders. At the core of the model we propose interactions of key attributes, namely cognitive flexibility, affect, and emotional intelligence. The result of these cognitive-affective attribute interactions is leadership that enhances the likelihood of complex project success.