827 results for Mining Equipment Technology Services Sector
Abstract:
Distributed Denial of Service (DDoS) attacks have become one of the biggest threats to resources on the Internet. The purpose of these attacks is to make servers deny service to legitimate users; the attacks are also used to consume network bandwidth. Current intrusion detection systems can only detect attacks; they cannot prevent them or trace the location of the intruders. Some schemes prevent attacks by simply discarding attack packets, which saves the victim from the attack but still wastes network bandwidth. In our opinion, DDoS requires a distributed solution to avoid this waste of resources. This paper presents a system that not only detects such attacks but also traces and blocks the multiple intruders (saving bandwidth as well) using Intelligent Software Agents. The system responds dynamically and can be integrated with existing network defense systems without disturbing the existing Internet model. We have implemented an agent-based network monitoring system for this purpose.
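As an illustration of the monitoring side of such a system, here is a minimal sketch of the per-source rate check a software agent might run; the class name, threshold and window size are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: a simplified rate-based check of the kind a
# monitoring agent could apply. Threshold and window values are assumed.
import time
from collections import defaultdict

PACKET_RATE_THRESHOLD = 1000  # packets/sec per source (hypothetical value)

class MonitoringAgent:
    """Counts packets per source IP and flags sources exceeding the rate threshold."""

    def __init__(self, window_seconds=1.0):
        self.window = window_seconds
        self.counts = defaultdict(int)
        self.window_start = time.monotonic()

    def observe(self, src_ip):
        # Reset the counters whenever the current window expires.
        now = time.monotonic()
        if now - self.window_start >= self.window:
            self.counts.clear()
            self.window_start = now
        self.counts[src_ip] += 1

    def suspects(self):
        # Sources whose per-window rate exceeds the threshold become
        # candidates for tracing and upstream blocking.
        return [ip for ip, n in self.counts.items()
                if n / self.window > PACKET_RATE_THRESHOLD]

agent = MonitoringAgent()
for _ in range(1500):
    agent.observe("10.0.0.1")   # simulated flood from one source
print(agent.suspects())          # ['10.0.0.1']
```

In a distributed deployment, agents at different vantage points would exchange such suspect lists to trace attack paths and block traffic upstream, rather than filtering at the victim alone.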
Abstract:
We describe the design and implementation of a public-key platform, secFleck, based on a commodity Trusted Platform Module (TPM) chip that extends the capability of a standard node. Unlike previous software public-key implementations, this approach provides e-commerce-grade security, is computationally fast and energy efficient, and has low financial cost. These are all essential attributes for secure large-scale sensor networks. We describe the secFleck message security services, such as confidentiality, authenticity and integrity, and present performance results including computation time, energy consumption and cost. This is followed by examples, built on secFleck, of symmetric key management, secure RPC and secure software update.
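For illustration, the following sketch implements the three message security services named above (confidentiality, authenticity, integrity) in pure software with the pyca/cryptography library. In secFleck the RSA operations are delegated to the TPM chip, so this is a functional analogue, not the platform's API.

```python
# Software analogue of public-key message security services; secFleck would
# perform the RSA operations on the TPM rather than in software.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"sensor reading: 21.5 C"

# Confidentiality: encrypt under the recipient's public key (RSA-OAEP).
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)

# Authenticity and integrity: sign with the sender's private key (RSA-PSS).
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())

# Verification raises InvalidSignature if the message was tampered with.
public_key.verify(signature, message, pss, hashes.SHA256())
assert private_key.decrypt(ciphertext, oaep) == message
```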
Abstract:
Communication security for wireless sensor networks (WSN) is a challenge due to the limited computation and energy resources available at nodes. We describe the design and implementation of a public-key (PK) platform based on a standard Trusted Platform Module (TPM) chip that extends the capability of a standard node. The result facilitates message security services such as confidentiality, authenticity and integrity. We present results including computation time, energy consumption and cost.
Abstract:
Technology and Nursing Practice explains and critically engages with the practice implications of technology for nursing. It takes a broad view of technology, covering not only health informatics, but also 'tele-nursing' and the use of equipment in clinical practice.
Abstract:
This report explains the objectives, datasets and evaluation criteria of both the clustering and classification tasks set in the INEX 2009 XML Mining track. The report also describes the approaches and results obtained by the different participants.
Abstract:
The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data.

Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing for removal of outliers, calculation of moving averages, and data summarisation and abstraction. Feature selection methods of both wrapper and filter types are applied to the derived physiological time-series variable sets alone, and to the same variables combined with risk factor variables; the ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used to evaluate models generated by different prediction algorithms.

The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although subsets derived with the Consistency method tended to slightly increase accuracy while markedly increasing complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution; this influence can be eliminated by considering the AUC or Kappa statistic, as well as by evaluating subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, the lowest value being for data from which both outliers and noise were removed (MR 10.69); for the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with time-segmented summary data (dataset F) at MR 9.8 and raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data alone, model complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets consist of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated under n-fold cross-validation.

For models based on Cfs-selected time-series-derived and risk factor (RF) variables, MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR at 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant.

The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant, while adding time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can thus produce compact and moderately accurate models able to predict coronary vascular disease, and decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables over risk factors alone parallels recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when both time-series and risk factor variables were used as model input; in the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
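A minimal sketch of the evaluation pipeline described above (majority-class under-sampling, then misclassification rate, Kappa statistic and AUC for a decision tree) might look as follows; the data here are synthetic stand-ins, not the thesis datasets.

```python
# Sketch of unbalanced-class evaluation: under-sample the majority class,
# then report misclassification rate, Cohen's kappa and AUC. Synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))  # stand-in for time-series + risk-factor features
y = (X[:, 0] + rng.normal(scale=2, size=2000) > 2).astype(int)  # ~20% positive

# Under-sample the majority class to match the minority class size.
pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
keep = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])
X_bal, y_bal = X[keep], y[keep]

X_tr, X_te, y_tr, y_te = train_test_split(X_bal, y_bal, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

pred = clf.predict(X_te)
print("misclassification rate:", 1 - accuracy_score(y_te, pred))
print("kappa:", cohen_kappa_score(y_te, pred))
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```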
Abstract:
This article reports the details of research on a novel design in the semitrailer sector and discusses design through hazard prevention techniques. The novel design addresses the occupational health and safety (OHS) concern of falls from height. The research includes a detailed survey of national data sources to examine fatalities caused by falls from height in car carriers. The study investigates OHS recommendations in Australia for the semitrailer sector. Injuries are often caused by drivers working above a height of 1.5 metres while loading and unloading cars, moving decks up and down, and strapping cars, and by slippery surfaces. The new design was developed using the latest computer-aided design and engineering (CAD, CAE), product data management (PDM) and virtual design process (VDP) tools. The new car carrier design excels in reducing the risk of injury to drivers and sets a new benchmark for OHS standards. In the new design, all decks are operated hydraulically, a unique ratchet lock mechanism (fool-proof design) is used, and loading happens at a safe working height (below 1.5 metres). All cars are strapped at the safe working height, and the car decks are then operated hydraulically to transfer them to the required position. This also includes the car on the prime mover, which shuttles across from one deck to another using hydraulic and rack-and-pinion mechanisms. The novel car carrier design solves the problem of falls from height; the next step would be to transfer this technology to other similarly affected sectors.
Abstract:
Composite web services comprise several component web services. When a composite web service is executed centrally, a single web service engine is responsible for coordinating the execution of the components, which may create a bottleneck and degrade the overall throughput of the composite service when there are a large number of service requests. Potentially this problem can be handled by decentralizing execution of the composite web service, but this raises the issue of how to partition a composite service into groups of component services such that each group can be orchestrated by its own execution engine while ensuring acceptable overall throughput of the composite service. Here we present a novel penalty-based genetic algorithm to solve the composite web service partitioning problem. Empirical results show that our new algorithm outperforms existing heuristic-based solutions.
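A minimal sketch of the penalty idea follows: infeasible partitions stay in the population but are scored with a penalty proportional to their constraint violation, so the search can move through infeasible regions. The encoding, cost model and penalty weight are illustrative assumptions, not the paper's actual formulation.

```python
# Penalty-based fitness for service partitioning (illustrative assumptions:
# unit component loads, a per-engine capacity, and a fixed penalty weight).
import random

def fitness(partition, comm_cost, capacity, penalty_weight=100.0):
    """partition[i] = index of the engine orchestrating component i."""
    # Objective: communication cost between components on different engines.
    n = len(partition)
    cost = sum(comm_cost[i][j] for i in range(n) for j in range(i + 1, n)
               if partition[i] != partition[j])
    # Penalty: total amount by which engine loads exceed capacity.
    loads = {}
    for engine in partition:
        loads[engine] = loads.get(engine, 0) + 1
    violation = sum(max(0, load - capacity) for load in loads.values())
    return cost + penalty_weight * violation   # lower is fitter

# Example: 4 components, 2 engines, random symmetric communication costs.
n = 4
comm = [[0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        comm[i][j] = comm[j][i] = random.randint(1, 9)
print(fitness([0, 0, 1, 1], comm, capacity=3))   # feasible: pure cost
print(fitness([0, 0, 0, 0], comm, capacity=3))   # infeasible: cost + penalty
```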
Abstract:
In cloud computing, resource allocation and scheduling of multiple composite web services is an important challenge. This is especially so in a hybrid cloud, where there may be some free resources available from private clouds but also fee-paying resources from public clouds. Meeting this challenge involves two classical computational problems. One is assigning resources to each of the tasks in the composite web service. The other is scheduling the allocated resources when each resource may be used by more than one task and may be needed at different points of time. In addition, Quality-of-Service issues such as execution time and running costs must be considered. Existing approaches to resource allocation and scheduling in public clouds and grid computing are not applicable to this new problem. This paper presents a random-key genetic algorithm that solves this new resource allocation and scheduling problem. Experimental results demonstrate the effectiveness and scalability of the algorithm.
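The decoding step that gives a random-key genetic algorithm its name can be sketched as follows; the chromosome layout here (one key per task for ordering, one per task for resource choice) is an illustrative assumption rather than the paper's exact encoding.

```python
# Random-key decoding: real-valued genes in [0, 1) are mapped to a task
# order (by sorting) and to discrete resource choices (by scaling).
import random

def decode(chromosome, n_tasks, n_resources):
    order_keys, resource_keys = chromosome[:n_tasks], chromosome[n_tasks:]
    # Task execution order: task indices sorted by their keys.
    order = sorted(range(n_tasks), key=lambda t: order_keys[t])
    # Resource assignment: scale each key onto the resource index range.
    assignment = [int(k * n_resources) for k in resource_keys]
    return order, assignment

chromosome = [random.random() for _ in range(2 * 5)]
print(decode(chromosome, n_tasks=5, n_resources=3))
```

Because every vector of keys decodes to a valid order and assignment, standard crossover and mutation never produce an infeasible chromosome, which is the main appeal of the random-key representation.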
Abstract:
Home Automation (HA) has emerged as a prominent field for researchers and investors confronting the challenge of penetrating the average home user market with products and services emerging from a technology-based vision. In spite of many technology contributions, there is a latent demand for affordable and pragmatic assistive technologies for pro-active handling of complex lifestyle-related problems faced by home users. This study has pioneered the development of an Initial Technology Roadmap for HA (ITRHA) that formulates a need-based vision of 10-15 years, identifying market, product and technology investment opportunities, focusing on those aspects of HA contributing to efficient management of home and personal life. The concept of the Family Life Cycle is developed to understand the temporal needs of the family. In order to formally describe a coherent set of family processes, their relationships, and their interaction with external elements, a reference model named Family System is established that identifies External Entities, 7 major Family Processes, and 7 subsystems: Finance, Meals, Health, Education, Career, Housing, and Socialisation. Analysis of these subsystems reveals Soft, Hard and Hybrid processes. Rectifying the lack of formal methods for eliciting future user requirements and reassessing evolving market needs, this study has developed a novel method called Requirement Elicitation of Future Users by Systems Scenario (REFUSS), integrating process modelling and scenario techniques within the framework of roadmapping. REFUSS is used to systematically derive process automation needs, relating process knowledge to future user characteristics identified from scenarios created to visualise different futures with richly detailed information on lifestyle trends, thus enabling learning about future requirements.

Revealing an addressable market size estimated at billions of dollars per annum, this research has developed innovative ideas for software-based products, including a Document Management System facilitating automated collection and easy retrieval of all documents, an Information Management System automating information services, and a Ubiquitous Intelligent System empowering highly mobile home users with ambient intelligence. Other product ideas include versatile Kitchen Hand and Cleaner Arm robotic devices that can save time. Materialisation of these products requires technology investment, initiating further research in the areas of data extraction and information integration, as well as manipulation and perception, sensor-actuator systems, tactile sensing, odour detection, and robotic controllers. This study recommends new policies on electronic data delivery from service providers as well as new standards for XML-based document structure and format.
Abstract:
The loosely-coupled and dynamic nature of web services architectures has many benefits, but also leads to an increased vulnerability to denial of service attacks. While many papers have surveyed and described these vulnerabilities, they are often theoretical, lack experimental data to validate them, and assume an obsolete state of web services technologies. This paper describes experiments involving several denial of service vulnerabilities in well-known web services platforms, including Java Metro, Apache Axis, and Microsoft .NET. The results confirm the presence of some of the most well-known vulnerabilities in web services technologies and refute the presence of others. Specifically, major web services platforms appear to cope well with attacks that target memory exhaustion. However, attacks targeting CPU-time exhaustion are still effective, regardless of the victim's platform.
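As a defensive illustration of the CPU-exhaustion class of attack (not code from the paper), the sketch below shows how an exponential entity expansion payload, one classic way to burn a parser's CPU time, can be rejected before parsing using the defusedxml package.

```python
# Reject entity-expansion payloads before they reach the XML parser.
import defusedxml.ElementTree as ET
from defusedxml import EntitiesForbidden

# A tiny expansion bomb: &b; expands to four copies of &a;, and deeper
# nesting grows the expansion exponentially.
malicious = """<?xml version="1.0"?>
<!DOCTYPE bomb [
  <!ENTITY a "aaaa">
  <!ENTITY b "&a;&a;&a;&a;">
]>
<root>&b;</root>"""

try:
    ET.fromstring(malicious)
except EntitiesForbidden:
    print("rejected: entity definitions are forbidden by default")
```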
Abstract:
This thesis consists of three related studies: an ERP Major Issues Study; an Historical Study of the Queensland Government Financial Management System; and a Meta-Study that integrates these and other related studies conducted under the umbrella of the Cooperative ERP Lifecycle Knowledge Management research program. This research provides a comprehensive view of ERP lifecycle issues encountered in SAP R/3 projects across the Queensland Government. This study follows a preliminary ERP issues study (Chang, 2002) conducted in five Queensland Government agencies. The Major Issues Study aims to achieve the following: (1) identify and explicate major issues in relation to the ES life-cycle in the public sector; (2) rank the importance of these issues; and (3) highlight areas of consensus and dissent among stakeholder groups. To provide a rich context for this study, this thesis includes an historical recount of the Queensland Government Financial Management System (QGFMS). This recount tells of its inception as a centralised system; the selection of SAP and subsequent decentralisation; and its eventual recentralisation under the Shared Services Initiative and CorpTech. This historical recount gives an insight into the conditions that affected the selection and ongoing management and support of QGFMS. This research forms part of a program entitled Cooperative ERP Lifecycle Knowledge Management. This thesis provides a concluding report for this research program by summarising related studies conducted in the Queensland Government SAP context: Chan (2003); Vayo et al. (2002); Ng (2003); Timbrell et al. (2001); Timbrell et al. (2002); Chang (2002); Putra (1998); and Niehus et al. (1998). A study of Oracle in the United Arab Emirates by Dhaheri (2002) is also included. The thesis then integrates the findings from these studies in an overarching Meta-Study. The Meta-Study discusses key themes across all of these studies, creating an holistic report for the research program. Themes discussed in the meta-study include common issues found across the related studies; knowledge dynamics of the ERP lifecycle; ERP maintenance and support; and the relationship between the key players in the ERP lifecycle.