972 results for modeling tools
Abstract:
In recent years, the advent of new tools for musculoskeletal simulation has increased the potential for significantly improving the ergonomic design process and the ergonomic assessment of designs. In this paper we investigate the use of one such tool, ‘The AnyBody Modeling System’, applied to solve a one-parameter, yet complex, ergonomic design problem. The aim of this paper is to investigate the potential of computer-aided musculoskeletal modelling in the ergonomic design process, in the same way as CAE technology has been applied to engineering design.
Abstract:
Companies and their services are being increasingly exposed to global business networks and Internet-based on-demand services. Much of the focus is on flexible orchestration and consumption of services, beyond the ownership and operational boundaries of services. However, how third parties in the “global village” can seamlessly self-create new offers out of existing services remains an open question. This paper proposes a framework for service provisioning in global business networks that allows an open-ended set of techniques for extending services through a rich, multi-tooling environment. The Service Provisioning Management Framework, as such, supports different modeling techniques, through supportive tools, allowing different parts of services to be integrated into new contexts. Integration of service user interfaces, business processes, operational interfaces and business objects is supported. The integration specifications that arise from service extensions are uniformly reflected through a kernel technique, the Service Integration Technique. Thus, the framework preserves the coherence of service provisioning tasks without constraining the modeling techniques needed for extending different aspects of services.
Abstract:
Virtual prototyping emerges as a new technology to replace existing physical prototypes for product evaluation, which are costly and time-consuming to manufacture. Virtualization technology allows engineers and ergonomists to perform virtual builds and different ergonomic analyses on a product. Digital Human Modelling (DHM) software packages, such as Siemens Jack, often integrate with CAD systems to provide a virtual environment that allows investigation of operator and product compatibility. Although the integration between DHM and CAD systems allows for the ergonomic analysis of anthropometric design, human musculoskeletal, multi-body modelling software packages such as the AnyBody Modeling System (AMS) are required to support physiological design. They provide muscular force analysis, estimate human musculoskeletal strain and help address human comfort assessment. However, the independent characteristics of the modelling systems Jack and AMS constrain engineers and ergonomists in conducting a complete ergonomic analysis. AMS is a stand-alone programming system without the capability to integrate into CAD environments. Jack provides CAD-integrated human-in-the-loop capability, but does not consider musculoskeletal activity. Consequently, engineers and ergonomists need to perform many redundant tasks during product and process design. Moreover, the existing biomechanical model in AMS uses a simplified estimation of body proportions, based on a scaling approach derived from segment mass ratios. This is insufficient to represent user populations in AMS in an anthropometrically correct manner. In addition, sub-models are derived from different sources of morphologic data and are therefore anthropometrically inconsistent. Therefore, an interface between the biomechanical AMS and the virtual human model Jack was developed to integrate musculoskeletal simulation with Jack posture modeling. This interface provides direct data exchange between the two human models, based on a consistent data structure and a common body model. The study assesses the kinematic and biomechanical model characteristics of Jack and AMS, and defines an appropriate biomechanical model. The information content for interfacing the two systems is defined and a protocol is identified. The interface program is developed and implemented in Tcl and Jack-script (Python), and interacts with the AMS console application to operate AMS procedures.
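As an illustration of the console-driven workflow described above, here is a minimal Python sketch of how an external program can operate AMS procedures through the AnyBody console application. The install path, the /m macro flag and the macro contents are assumptions to be checked against the AMS documentation, not details taken from the study.

```python
# Sketch: driving the AnyBody console application from an external script,
# analogous to what the Jack-script (Python) side of the interface does.
# The executable path and the '/m' macro flag are assumptions; the macro
# commands (load/operation/run) follow the AnyScript macro language.
import os
import subprocess
import tempfile

ANYBODYCON = r"C:\Program Files\AnyBody Technology\AnyBodyCon.exe"  # assumed path

def run_inverse_dynamics(model_path: str) -> str:
    """Run an inverse dynamics study on a model and return the console output."""
    macro = "\n".join([
        f'load "{model_path}"',                   # load the musculoskeletal model
        "operation Main.Study.InverseDynamics",   # select the study operation
        "run",                                    # execute it
        "exit",
    ])
    with tempfile.NamedTemporaryFile("w", suffix=".anymcr", delete=False) as f:
        f.write(macro)
        macro_file = f.name
    try:
        result = subprocess.run([ANYBODYCON, "/m", macro_file],
                                capture_output=True, text=True, check=True)
        return result.stdout                      # parse muscle forces etc. from here
    finally:
        os.remove(macro_file)
```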
Abstract:
Digital human modeling (DHM), as a convenient and cost-effective tool, is increasingly incorporated into product and workplace design. In product design, it is predominantly used for the development of driver-vehicle systems. Most digital human modeling software tools, such as JACK, RAMSIS and DELMIA HUMANBUILDER, provide functions to predict postures and positions for drivers with selected anthropometry according to SAE (Society of Automotive Engineers) Recommended Practices and other ergonomics guidelines. However, few studies have presented 2nd-row passenger postural information, and digital human modeling of these passenger postures cannot be performed directly using the existing driver posture prediction functions. In this paper, the significant studies related to occupant posture and modeling were reviewed and a framework of determinants of driver vs. 2nd-row occupant posture modeling was extracted. The determinants, which are regarded as input factors for posture modeling, include target population anthropometry, vehicle package geometry and seat design variables as well as task definitions. The differences between the determinants of driver and 2nd-row occupant posture models are significant, as driver posture modeling is primarily based on the position of the foot on the accelerator pedal (accelerator actuation point AAP, accelerator heel point AHP) and the hands on the steering wheel (steering wheel centre point, A-Point). This paper aims to investigate those differences between driver and passenger posture, and to supplement the existing parametric model for occupant posture prediction. Guided by the framework, the associated input parameters of occupant digital human models for both the driver and the second-row occupant will be identified. Going beyond the existing occupant posture models, a driver posture model could, for example, be modified to predict second-row occupant posture by adjusting the associated input parameters introduced in this paper. This study combines results from a literature review and the theoretical modeling stage of a second-row passenger posture prediction model project.
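To make the framework concrete, the sketch below shows the general shape of such a parametric posture model: posture descriptors predicted from the determinant categories named above (anthropometry, package geometry, seat variables), with separate coefficient sets for the driver and the 2nd-row occupant. The coefficients and the choice of torso angle as the output are illustrative placeholders, not values from any published model.

```python
# Sketch of a parametric posture-prediction function. All coefficients are
# placeholders for illustration only; a real model would be fitted to
# measured posture data for the target population.
from dataclasses import dataclass

@dataclass
class OccupantInputs:
    stature_mm: float         # target-population anthropometry
    seat_height_mm: float     # vehicle package geometry (e.g., H30)
    cushion_angle_deg: float  # seat design variable

def predict_torso_angle(x: OccupantInputs, is_driver: bool) -> float:
    """Predict a torso recline angle (degrees). Drivers and 2nd-row occupants
    use different coefficients because their posture determinants differ:
    pedal and steering-wheel constraints apply only to the driver."""
    if is_driver:
        b0, b1, b2, b3 = 10.0, 0.004, 0.010, 0.50  # placeholder coefficients
    else:
        b0, b1, b2, b3 = 14.0, 0.003, 0.012, 0.65  # placeholder coefficients
    return b0 + b1 * x.stature_mm + b2 * x.seat_height_mm + b3 * x.cushion_angle_deg

print(predict_torso_angle(OccupantInputs(1750, 270, 14.5), is_driver=False))
```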
Abstract:
In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
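The abstract does not disclose the models' internals, but a minimal supervised text-mining baseline of the kind such models are evaluated against can be sketched as follows; the reviews and labels are toy data, and the actual design additionally integrates a semantic language model.

```python
# Minimal spam/ham review classifier: TF-IDF n-gram features plus a
# discriminative classifier. A deliberately simple baseline sketch, not the
# authors' text mining + semantic language model design.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [  # toy data; 0 = ham (genuine), 1 = spam (fake)
    "The zoom is sharp and the battery lasts a full day of shooting.",
    "Amazing!!! Best camera ever made!!! Everyone must buy this now!!!",
    "Autofocus hunts a little in low light, but photos are otherwise crisp.",
    "Five stars!!! Perfect perfect perfect, changed my life, buy buy buy!!!",
]
labels = [0, 1, 0, 1]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(reviews, labels)
print(model.predict_proba(["Incredible!!! Buy it now, best product ever!!!"]))
```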
Abstract:
Background: Access to cardiac services is essential for the appropriate implementation of evidence-based therapies to improve outcomes. The Cardiac Accessibility and Remoteness Index for Australia (Cardiac ARIA) project aimed to derive an objective, geographic measure reflecting access to cardiac services.
Methods: An expert panel defined an evidence-based clinical pathway. Using Geographic Information Systems (GIS), a numeric/alpha index was developed at two points along the continuum of care. The acute category (numeric) measured the time from the emergency call to arrival at an appropriate medical facility via road ambulance. The aftercare category (alpha) measured access to four basic services (family doctor, pharmacy, cardiac rehabilitation, and pathology services) when a patient returned to their community.
Results: The numeric index ranged from 1 (access to a principal referral center with a cardiac catheterization service ≤ 1 hour away) to 8 (no ambulance service, > 3 hours to a medical facility, air transport required). The alphabetic index ranged from A (all four services available within a 1-hour drive time) to E (no services available within 1 hour). 13.9 million Australians (71%) resided within Cardiac ARIA 1A locations (a hospital with a cardiac catheterization laboratory and all aftercare services within 1 hour). People aged over 65 years (32%) and Indigenous people (60%) were over-represented outside Cardiac ARIA 1A locations.
Conclusion: The Cardiac ARIA index demonstrated substantial inequity in access to cardiac services in Australia. This methodology can be used to inform cardiology health service planning and could be applied to other common disease states in other regions of the world.
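The two-part index can be pictured as a pair of lookup functions over GIS drive-time outputs. The sketch below encodes only the endpoint categories defined in the abstract; the intermediate cut-offs (2-7 and B-D) are left as placeholders because the abstract does not define them.

```python
# Sketch of the two-part Cardiac ARIA classification. Only the endpoint
# categories stated in the abstract are encoded; intermediate categories
# are placeholders.
def acute_category(hours_to_cath_lab: float, ambulance_available: bool,
                   hours_to_any_facility: float):
    """Numeric component: time from emergency call to an appropriate facility."""
    if ambulance_available and hours_to_cath_lab <= 1.0:
        return 1   # principal referral center with catheterization service <= 1 h
    if not ambulance_available and hours_to_any_facility > 3.0:
        return 8   # no ambulance, > 3 h to a facility: air transport required
    ...            # categories 2-7: cut-offs not given in the abstract

def aftercare_category(services_within_1h: set):
    """Alpha component: family doctor, pharmacy, cardiac rehab, pathology."""
    if len(services_within_1h) == 4:
        return "A"  # all four services within a 1-hour drive
    if len(services_within_1h) == 0:
        return "E"  # no services within 1 hour
    ...             # categories B-D: cut-offs not given in the abstract
```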
Abstract:
Post-deployment maintenance and evolution can account for up to 75% of the cost of developing a software system. Software refactoring can reduce the costs associated with evolution by improving system quality. Although refactoring can yield benefits, the process includes potentially complex, error-prone, tedious and time-consuming tasks. It is these tasks that automated refactoring tools seek to address. However, although the refactoring process is well-defined, current refactoring tools do not support the full process. To develop better automated refactoring support, we have completed a usability study of software refactoring tools. In the study, we analysed the task of software refactoring using the ISO 9241-11 usability standard and Fitts' List of task allocation. Expanding on this analysis, we reviewed 11 collections of usability guidelines and combined these into a single list of 38 guidelines. From this list, we developed 81 usability requirements for refactoring tools. Using these requirements, the software refactoring tools Eclipse 3.2, Condenser 1.05, RefactorIT 2.5.1, and Eclipse 3.2 with the Simian UI 2.2.12 plugin were studied. Based on the analysis, we have selected a subset of the requirements that can be incorporated into a prototype refactoring tool intended to address the full refactoring process.
Abstract:
During the course of several natural disasters in recent years, Twitter has been found to play an important role as an additional medium for many-to-many crisis communication. Emergency services are successfully using Twitter to inform the public about current developments, and are increasingly also attempting to source first-hand situational information from Twitter feeds (such as relevant hashtags). However, the further study of the uses of Twitter during natural disasters relies on the development of flexible and reliable research infrastructure for tracking and analysing Twitter feeds at scale and in close to real time. This article outlines two approaches to the development of such infrastructure: one which builds on the readily available open-source platform yourTwapperkeeper to provide a low-cost, simple, and basic solution; and one which establishes a more powerful and flexible framework by drawing on highly scalable, state-of-the-art technology.
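As a minimal illustration of the tracking layer such infrastructure provides, the sketch below archives tweets matching chosen hashtags using the tweepy 3.x streaming interface, which was current in the yourTwapperkeeper era (Twitter's v2 API has since replaced it); the credentials and hashtags are placeholders.

```python
# Track and archive hashtag streams as newline-delimited JSON, the raw
# material for later analysis. Uses the pre-v2 tweepy 3.x streaming API.
import json
import tweepy

class HashtagArchiver(tweepy.StreamListener):
    def on_status(self, status):
        with open("tweets.jsonl", "a") as f:     # append one JSON object per tweet
            f.write(json.dumps(status._json) + "\n")

    def on_error(self, status_code):
        return status_code != 420                # disconnect if rate-limited

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")  # placeholder keys
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
stream = tweepy.Stream(auth=auth, listener=HashtagArchiver())
stream.filter(track=["#eqnz", "#qldfloods"])     # crisis hashtags to track
```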
Abstract:
This study investigated potential palaeoclimate proxies provided by rare earth element (REE) geochemistry in speleothems and by the clay mineralogy of cave sediments. Speleothem and sediment samples were collected from a series of cave fill deposits that occur with rich vertebrate fossil assemblages in and around Mount Etna National Park, Rockhampton (central coastal Queensland). The fossil deposits range from Plio-Pleistocene to Holocene in age (based on uranium/thorium dating) and appear to represent depositional environments ranging from enclosed rainforest to semi-arid grasslands. The Mount Etna cave deposits therefore offer an ideal opportunity to test new palaeoclimate tools, as they include deposits that span a known significant climate shift established on the basis of independent faunal data. The first section of this study investigates the REE distribution of the host limestone to provide baseline geochemistry for subsequent speleothem investigations. The Devonian Mount Etna Beds were found to be more complex than the previous literature had documented. The studied limestone massif is overturned, highly recrystallised in parts, and consists of numerous allochthonous blocks with different spatial orientations. Despite the complex geologic history of the Mount Etna Beds, Devonian seawater-like REE patterns were recovered in some parts of the limestone, and baseline geochemistry was determined for the bulk limestone for comparison with speleothem REE patterns. The second part of the study focused on REE distribution in the karst system and the palaeoclimatic implications of such records. It was found that REEs have a high affinity for calcite surfaces and that REE distributions in speleothems vary between growth bands much more than along growth bands, thus providing a temporal record that may relate to environmental changes. The morphology of different speleothems (i.e., stalactites, stalagmites, and flowstones) has little bearing on REE distributions provided they are not contaminated with particulate fines. Thus, the baseline knowledge developed in the study suggested that speleothems are broadly comparable for assessing palaeoclimatically controlled variations in REE distributions. Speleothems from rainforest and semi-arid phases were compared, and definable differences in REE distribution were found that can be attributed to climate. In particular, during semi-arid phases, total REE concentrations decreased, LREEs became more depleted, Y/Ho increased, La anomalies were more positive and Ce anomalies were more negative. This may reflect greater soil development during rainforest phases, and hence more organic particles and colloids, which are known to transport REEs, in karst waters. However, on a finer temporal scale (i.e., between growth bands) within speleothems from the same climate regime, no difference was seen. This may be because the time frames represented by differences in growth band density are too short for changes in soil development. The third part of the study was a reconnaissance investigation focused on the mineralogy of clay cave sediments, illite/kaolinite ratios in particular, and the potential palaeoclimatic implications of such records. Although the sample distribution was not optimal, the preliminary results suggest that the illite/kaolinite ratio increased during cold and dry intervals, consistent with decreased chemical weathering during those times.
The study provides a basic framework for future studies at differing latitudes to further constrain the parameters of the proxy. The identification of such a proxy recorded in cave sediment has broad implications, as clay ratios could potentially provide a basic local climate proxy in the absence of fossil faunas and speleothem material. This study suggests that REEs distributed in speleothems may provide information about water throughput and soil formation, thus providing a potential palaeoclimate proxy. It highlights the importance of understanding the host limestone geochemistry, and it broadens the distribution and potential number of cave field sites, as palaeoclimate information no longer relies solely on the presence of fossil faunas and/or speleothems. However, additional research is required to better understand the temporal scales required for the proxies to be recognised.
Abstract:
Purpose – The rapidly changing role of capital city airports has placed demands on surrounding infrastructure. The need for infrastructure management and coordination is increasing as airports and cities grow and share common infrastructure frameworks. The purpose of this paper is to document the changing context in Australia, where the privatisation of airports has stimulated considerable land development, with resulting pressures on surrounding infrastructure provision. It aims to describe a tool that is being developed to support decision-making between various stakeholders in the airport region. The use of planning support systems improves both communication and data transfer between stakeholders and provides a foundation for complex decisions on infrastructure.
Design/methodology/approach – The research uses a case study approach and focuses on Brisbane International Airport and Brisbane City Council. The research is primarily descriptive and provides an empirical assessment of the challenges of developing and implementing planning support systems as a tool for governance and decision-making.
Findings – The research assesses the challenges in implementing a common data platform for stakeholders. Agency data platforms and models, traditional roles in infrastructure planning, and the integration of similar data platforms all present barriers to sharing a common language. A decision support system has to be shared by all stakeholders on a common platform that is versatile enough to support scenarios and changing conditions. The use of iPads for scenario modelling gives stakeholders the opportunity to interact, compare scenarios and views, and work with the modellers to explore other options.
Originality/value – The research confirms that planning support systems have to be accessible to, and interactive for, their users. The Airport City concept is a new and evolving focus for airport development and will place continuing pressure on infrastructure servicing. A coordinated and efficient approach to infrastructure decision-making is critical, and an interactive planning support system that can model infrastructure scenarios provides a sound tool for governance.
Abstract:
The work presented in this poster outlines the steps taken to model a 4 mm conical collimator (BrainLab, Germany) on a Novalis Tx linear accelerator (Varian, Palo Alto, USA) capable of producing a 6 MV photon beam for the treatment of Stereotactic Radiosurgery (SRS) patients. The verification of this model was performed by measurements in liquid water and in virtual water. The measurements involved scanning depth dose curves and profiles in a water tank, plus the measurement of output factors in virtual water using Gafchromic® EBT3 film.
Abstract:
An increase in the likelihood of navigational collisions in port waters has focused attention on the collision avoidance process in port traffic safety. The most widely used on-board collision avoidance system is the automatic radar plotting aid, which is a passive warning system that triggers an alert based on the pilot’s pre-defined indicators of distance and time proximities at the closest points of approach in encounters with nearby vessels. To better help pilots with decision making in close-quarters situations, collision risk should be considered as a continuous monotonic function of the proximities, and risk perception should be considered probabilistically. This paper derives an ordered probit regression model to study perceived collision risks. To illustrate the procedure, the risks perceived by Singapore port pilots were obtained to calibrate the regression model. The results demonstrate that a framework based on the probabilistic risk assessment model can be used to give a better understanding of collision risk and to define a more appropriate level of evasive action.
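For reference, a generic ordered probit specification for such perceived-risk ratings is sketched below; the notation is generic rather than the paper's, with the covariates standing in for the distance and time proximity measures.

```latex
% Ordered probit sketch: r_i^* is a latent risk score for encounter i,
% x_i the proximity covariates, and mu_1 < ... < mu_{J-1} the estimated
% thresholds separating the J ordered risk categories.
\begin{align*}
  r_i^* &= \boldsymbol{\beta}^\top \mathbf{x}_i + \varepsilon_i,
          \qquad \varepsilon_i \sim \mathcal{N}(0, 1), \\
  r_i &= j \quad \text{if } \mu_{j-1} < r_i^* \le \mu_j, \\
  \Pr(r_i = j \mid \mathbf{x}_i)
      &= \Phi\!\left(\mu_j - \boldsymbol{\beta}^\top \mathbf{x}_i\right)
       - \Phi\!\left(\mu_{j-1} - \boldsymbol{\beta}^\top \mathbf{x}_i\right).
\end{align*}
```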
Abstract:
Navigational collisions are one of the major safety concerns in many seaports. Despite the extent of recent work on port navigational safety research, little is known about harbor pilots' perception of collision risks in port fairways. This paper uses a hierarchical ordered probit model to investigate associations between perceived risks, the geometric and traffic characteristics of fairways, and pilot attributes. Perceived risk data, collected through a risk perception survey conducted among Singapore port pilots, are used to calibrate the model. The intra-class correlation coefficient justifies the use of the hierarchical model over an ordinary model. Results show higher perceived risks in fairways attached to anchorages, and in those featuring sharper bends and higher traffic operating speeds. Lower risks are perceived in fairways attached to shorelines and confined waters, and in those with one-way traffic, a traffic separation scheme, cardinal marks and isolated danger marks. Risk is also perceived to be higher at night.
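In the same generic notation as above, the hierarchical extension adds a group-level random effect to the ordered probit latent score; treating observations as nested within pilots is an assumption here, since the abstract does not name the grouping level. With the probit error variance fixed at 1, the intra-class correlation coefficient used to justify the hierarchy follows directly.

```latex
% Hierarchical (two-level) ordered probit sketch: u_k is a random intercept
% for group k (e.g., a pilot), and rho is the intra-class correlation.
\begin{align*}
  r_{ik}^* &= \boldsymbol{\beta}^\top \mathbf{x}_{ik} + u_k + \varepsilon_{ik},
    \qquad u_k \sim \mathcal{N}(0, \sigma_u^2),\quad
    \varepsilon_{ik} \sim \mathcal{N}(0, 1), \\
  \rho &= \frac{\sigma_u^2}{\sigma_u^2 + 1}.
\end{align*}
```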
Abstract:
Advances in safety research, which aim to improve the collective understanding of motor vehicle crash causes and contributing factors, rest upon the pursuit of numerous lines of research inquiry. The research community has focused considerable attention on analytical methods development (negative binomial models, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might logically seek to know which lines of inquiry might provide the most significant improvements in understanding crash causation and/or prediction. It is the contention of this paper that the exclusion of important variables (causal or surrogate measures of causal variables) causes omitted variable bias in model estimation, and that this is an important and neglected line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and are therefore omitted from crash models, yet they offer significant opportunities to better understand the contributing factors and/or causes of crashes. This study examines the role of important variables (other than Average Annual Daily Traffic (AADT)) that are generally omitted from intersection crash prediction models. In addition to geometric and traffic-regulatory information about each intersection, the proposed model includes spatial factors such as local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools, representing a mix of potential environmental and human factors that are theoretically important but rarely used. Results suggest that these variables, in addition to AADT, have significant explanatory power and that their exclusion leads to omitted variable bias. Evidence is provided that variable exclusion overstates the effect of minor-road AADT by as much as 40% and that of major-road AADT by 14%.
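As a sketch of the model class at issue, written in generic notation rather than the paper's specification, a negative binomial crash-frequency model with the usually omitted covariates included might look as follows.

```latex
% Negative binomial (Poisson-gamma) crash-frequency sketch for intersection i:
% y_i is the crash count, and z_i collects the usually omitted spatial factors
% (weather, sun glare, proximity to schools and drinking establishments).
\begin{align*}
  y_i &\sim \mathrm{NegBin}(\lambda_i, \alpha), \\
  \ln \lambda_i &= \beta_0
      + \beta_1 \ln \mathrm{AADT}_{\mathrm{major},i}
      + \beta_2 \ln \mathrm{AADT}_{\mathrm{minor},i}
      + \boldsymbol{\gamma}^\top \mathbf{z}_i.
\end{align*}
% If z_i is correlated with the AADT terms and omitted, the estimates of
% beta_1 and beta_2 absorb its effect; this is the omitted variable bias the
% paper quantifies (up to 40% for minor-road and 14% for major-road AADT).
```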
Abstract:
Many corporations and individuals realize that environmental sustainability is an urgent problem to address. In this chapter, we contribute to the emerging academic discussion by proposing two innovative approaches for engaging in the development of environmentally sustainable business processes. First, we describe an extended process modeling approach for capturing and documenting the carbon dioxide emissions produced during the execution of a business process. For illustration, we apply this approach to the case of a government Shared Service provider. Second, we introduce an analysis method for measuring the carbon dioxide emissions produced during the execution of a business process. To illustrate this approach, we apply it in the real-life case of a European airport and show how this information can be leveraged in the re-design of "green" business processes.
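A minimal sketch of the measurement idea, using hypothetical activities and emission factors rather than the chapter's data: each activity in a process model is annotated with a CO2-equivalent figure, and emissions are aggregated over an executed process instance.

```python
# Annotate process activities with CO2-equivalent emission factors and
# aggregate over an execution trace. Activity names and factors are
# illustrative placeholders, not data from the airport case.
CO2_PER_ACTIVITY_KG = {
    "check_in_passenger": 0.02,   # kg CO2e per execution of the activity
    "screen_baggage": 0.15,
    "tow_aircraft": 12.0,
}

def trace_emissions(trace: list) -> float:
    """Total CO2-equivalent (kg) for one executed process instance."""
    return sum(CO2_PER_ACTIVITY_KG[activity] for activity in trace)

print(trace_emissions(["check_in_passenger", "screen_baggage", "tow_aircraft"]))
```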