132 results for Model Driven Architecture (MDA)
Abstract:
Recent advances in the area of ‘Transformational Government’ position the citizen at the centre of focus. This paradigm shift from a department-centric to a citizen-centric focus requires governments to re-think their approach to service delivery, with the aim of decreasing costs and increasing citizen satisfaction. The introduction of franchises as a virtual business layer between departments and their citizens is intended to provide a solution. Franchises are structured to address the needs of citizens independently of internal departmental structures. For delivering services online, governments pursue the development of a One-Stop Portal, which structures information and services through these franchises. Thus, each franchise can be mapped to a specific service bundle, which groups together services deemed relevant to a specific citizen need. This study focuses on the development and evaluation of these service bundles. In particular, two research questions guide the investigation: Research Question 1): What methods can be used by governments to identify service bundles as part of governmental One-Stop Portals? Research Question 2): How can the quality of service bundles in governmental One-Stop Portals be evaluated? The first research question concerns the identification of suitable service bundling methods. A literature review was first conducted to conceptualise the service bundling task in general. As a result, a 4-layer model of service bundling and a morphological box were created, detailing characteristics that are relevant when identifying service bundles. Furthermore, a literature review of Decision-Support Systems was conducted to identify approaches relevant to different bundling scenarios. These initial findings were complemented by targeted studies of multiple leading governments in the e-government domain, as well as consultation with a local expert in the field.
Here, the aim was to identify the current status of online service delivery and service bundling in practice. These findings led to the conceptualisation of two service bundle identification methods applicable in the context of the Queensland Government: on the one hand, a provider-driven approach based on service description languages, attributes, and relationships between services; on the other, a citizen-driven approach based on analysing the outcomes of content identification and grouping workshops with citizens. Both methods were then applied and evaluated in practice. The conceptualisation of the provider-driven method required the initial specification of relevant attributes that could be used to identify similarities between services, termed relationships; these relationships then formed the basis for the identification of service bundles. This study conceptualised and defined seven relationships, namely ‘Co-location’, ‘Resource’, ‘Co-occurrence’, ‘Event’, ‘Consumer’, ‘Provider’, and ‘Type’. The relationships, and the bundling method itself, were applied and refined as part of six Action Research cycles in collaboration with the Queensland Government. The findings show that attributes and relationships can be used effectively as a means of bundle identification, provided distinct decision rules are in place to prescribe how services are to be identified. For the conceptualisation of the citizen-driven method, insights from the case studies led to the decision to involve citizens through card sorting activities. Based on an initial list of services relevant to a certain franchise, participating citizens grouped services as they saw fit. The card sorting activity, as well as the required analysis and aggregation of the individual card sorting results, was analysed in depth as part of this study.
A framework was developed that can be used as a decision-support tool to assist in deciding which card sorting analysis method should be utilised in a given scenario. The characteristics of card sorting in a government context led to the decision to utilise statistical analysis approaches, such as cluster analysis and factor analysis, to aggregate the card sorting results. The second research question asks how the quality of service bundles can be assessed. An extensive literature review was conducted, focussing on bundle, portal, and e-service quality. It was found that different studies use different constructs, terminology, and units of analysis, which makes comparing these models difficult. As a direct result, a framework was conceptualised that can be used to position past and future studies in this research domain. Complementing the literature review, interviews conducted as part of the case studies with leaders in e-government indicated that satisfaction is typically evaluated for the overall portal once it is online, but that quality tests are not conducted during the development phase. Consequently, a research model that appropriately defines perceived service bundle quality needed to be developed from scratch. Based on existing theory, such as the Theory of Reasoned Action, Expectation Confirmation Theory, and the Theory of Affordances, perceived service bundle quality was defined as an inferential belief and positioned within the nomological net of services. Based on the literature analysis on quality and the subsequent work of a focus group, the hypothesised antecedents (descriptive beliefs) of the construct and the associated question items were defined, and the research model was conceptualised. The model was then tested, refined, and finally validated during six Action Research cycles.
Results show no significant difference in perceived quality or user satisfaction between the provider-driven and the citizen-driven methods. The decision on which method to choose, it was found, should be based on contextual factors, such as objectives, resources, and the need for visibility. The constructs of the bundle quality model were examined. While the quality of bundles identified through the citizen-centric approach could be explained through the constructs ‘Navigation’, ‘Ease of Understanding’, and ‘Organisation’, bundles identified through the provider-driven approach could be explained solely through the constructs ‘Navigation’ and ‘Ease of Understanding’. An active labelling style for bundles, as part of the provider-driven Information Architecture, had a larger impact on ‘Quality’ than the topical labelling style used in the citizen-centric Information Architecture. However, ‘Organisation’, reflecting the internal, logical structure of the Information Architecture, was a significant factor impacting ‘Quality’ only in the citizen-driven Information Architecture. Hence, it was concluded that active labelling can compensate for a lack of logical structure. Further studies are needed to test this conjecture. Such studies may involve building alternative models and conducting additional empirical research (e.g. use of an active labelling style for the citizen-driven Information Architecture). This thesis contributes to the body of knowledge in several ways. Firstly, it presents an empirically validated model of the factors explaining and predicting a citizen’s perception of service bundle quality. Secondly, it provides two alternative methods that governments can use to identify service bundles when structuring the content of a One-Stop Portal.
Thirdly, this thesis provides a detailed narrative suggesting how the recent paradigm shift in the public sector, towards a citizen-centric focus, can be pursued by governments; the research methodology followed by this study can serve as an exemplar for governments seeking to achieve a citizen-centric approach to service delivery.
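The card-sorting aggregation step described in this abstract can be sketched in miniature. The service names, the co-occurrence similarity measure, and the merge threshold below are all invented for illustration; the thesis itself used statistical cluster analysis and factor analysis on real workshop data.

```python
from itertools import combinations

# Hypothetical card sorts: each participant groups service IDs into piles.
sorts = [
    [{"A", "B"}, {"C", "D"}],
    [{"A", "B", "C"}, {"D"}],
    [{"A", "B"}, {"C"}, {"D"}],
]
services = sorted(set().union(*[pile for s in sorts for pile in s]))

def cooccurrence(a, b):
    """Fraction of participants who placed services a and b in the same pile."""
    same = sum(any({a, b} <= pile for pile in s) for s in sorts)
    return same / len(sorts)

# Simple single-linkage agglomeration: merge clusters whose most similar
# cross-cluster pair exceeds an (assumed) similarity threshold.
threshold = 0.6
clusters = [{s} for s in services]
merged = True
while merged:
    merged = False
    for c1, c2 in combinations(clusters, 2):
        if max(cooccurrence(a, b) for a in c1 for b in c2) >= threshold:
            clusters.remove(c1)
            clusters.remove(c2)
            clusters.append(c1 | c2)
            merged = True
            break
```

With these toy sorts, only A and B co-occur often enough to be bundled together.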
Abstract:
In this study, a hierarchical nano/microfibrous chitosan/collagen scaffold that approximates the structural and functional attributes of native extracellular matrix (ECM) has been developed for applicability in skin tissue engineering. Scaffolds were produced by electrospinning of chitosan, followed by imbibing of collagen solution, freeze-drying, and subsequent cross-linking of the two polymers. Scanning electron microscopy showed the formation of layered scaffolds with nano/microfibrous architecture. Physico-chemical properties of the scaffolds, including tensile strength, swelling behavior and biodegradability, were found satisfactory for the intended application. 3T3 fibroblasts and HaCaT keratinocytes showed good in vitro cellular response on the scaffolds, thereby indicating the matrices' cytocompatible nature. Scaffolds tested in an ex vivo human skin equivalent (HSE) wound model, as a preliminary alternative to animal testing, showed keratinocyte migration and wound re-epithelialization, a prerequisite for healing and regeneration. Taken together, the proposed chitosan/collagen scaffold shows good potential for skin tissue engineering.
Abstract:
This thesis takes a new data mining approach to analyzing road and crash data, developing models for the whole road network and generating a crash risk profile. Roads with an elevated crash risk due to road surface friction deficit are identified. The regression tree model, predicting road segment crash rate, is applied in a novel deployment coined ‘regression tree extrapolation’, which produces a skid resistance/crash rate curve. Using extrapolation allows the method to be applied across the network and to cope with the high proportion of missing road surface friction values. This risk profiling method can be applied in other domains.
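The regression tree idea can be illustrated with a deliberately tiny, hypothetical example: a one-split tree fitted to assumed (skid resistance, crash rate) pairs, whose predictions are then read off across the whole skid resistance range to form a curve. The real model and data are far richer; everything below is illustrative only.

```python
# Hypothetical (skid resistance, crash rate) pairs for road segments.
data = [(0.30, 4.1), (0.35, 3.8), (0.40, 2.0), (0.45, 1.9),
        (0.50, 1.1), (0.55, 0.9), (0.60, 0.8), (0.65, 0.7)]

def best_split(points):
    """One-level regression tree: the split minimising summed squared error."""
    best = None
    xs = sorted(p[0] for p in points)
    for t in [(a + b) / 2 for a, b in zip(xs, xs[1:])]:
        left = [y for x, y in points if x <= t]
        right = [y for x, y in points if x > t]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    return best[1:]

t, mean_low, mean_high = best_split(data)

def predict(skid):
    """Extrapolated skid-resistance/crash-rate curve: evaluable anywhere,
    including on segments with no measured friction value."""
    return mean_low if skid <= t else mean_high

curve = [(s / 100, predict(s / 100)) for s in range(25, 75, 5)]
```

The `curve` list is the point of the exercise: once fitted, the tree's predictions cover the full skid resistance range, mirroring how extrapolation copes with missing friction values.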
Abstract:
Establishing a persistent presence in the ocean with an autonomous underwater vehicle (AUV) capable of observing temporal variability of large-scale ocean processes requires a unique sensor platform. In this paper, we examine the utility of vehicles that can only control their depth in the water column for such extended deployments. We present a strategy that utilizes ocean model predictions to facilitate a basic level of autonomy and enables general control for these profiling floats. The proposed method is based on experimentally validated techniques for utilizing ocean current models to control autonomous gliders. With the appropriate vertical actuation, and utilizing spatio–temporal variations in water speed and direction, we show that general controllability results can be met. First, we apply an A* planner to a local controllability map generated from predictions of ocean currents. This computes a path between start and goal waypoints that has the highest likelihood of successful execution. A computed depth plan is generated with a model-predictive controller (MPC), and selects the depths for the vehicle so that ambient currents guide it toward the goal. Mission constraints are included to simulate and motivate a practical data collection mission. Results are presented in simulation for a mission off the coast of Los Angeles, CA, USA, that show encouraging results in the ability of a drifting vehicle to reach a desired location.
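The planning step can be sketched in miniature. Below, a hypothetical local controllability map is a small grid in which cells marked 1 stand for regions the float cannot traverse given predicted currents; A* finds a shortest feasible path between waypoints. The paper's planner works on ocean model predictions and adds MPC depth selection, both omitted here.

```python
import heapq

# Hypothetical local controllability map: 0 = traversable, 1 = adverse currents.
grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def astar(grid, start, goal):
    """A* with 4-connected moves and a Manhattan-distance heuristic."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]   # (f, g, node, path)
    seen = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0):
                heapq.heappush(frontier,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc),
                                path + [(nr, nc)]))
    return None   # no controllable route between the waypoints

path = astar(grid, (0, 0), (4, 4))
```

On this toy map an 8-move route exists that threads between the blocked cells, so the planner returns a 9-node path.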
Abstract:
Road surface skid resistance has been shown to have a strong relationship to road crash risk; however, applying the current method of using investigatory levels to identify crash-prone roads is problematic, as it may fail to identify risky roads outside of the norm. The proposed method analyses a complex and formerly impenetrable volume of data from roads and crashes using data mining. This method rapidly identifies roads with an elevated crash rate, potentially due to skid resistance deficit, for investigation. A hypothetical skid resistance/crash risk curve is developed for each road segment, driven by the model deployed in a novel regression tree extrapolation method. The method potentially solves the problem of missing skid resistance values that occurs during network-wide crash analysis, and allows risk assessment of the major proportion of roads without skid resistance values.
Abstract:
Pesticides used in agricultural systems must be applied in economically viable and environmentally sensitive ways, and this often requires expensive field trials on spray deposition and retention by plant foliage. Computational models to describe whether a spray droplet sticks (adheres), bounces or shatters on impact, and whether any rebounding parent or shatter daughter droplets are recaptured, would provide an estimate of spray retention and thereby act as a useful guide prior to any field trials. Parameter-driven interactive software has been implemented to enable the end-user to study and visualise droplet interception and impaction on a single, horizontal leaf. Living chenopodium, wheat and cotton leaves have been scanned to capture their surface topography, and realistic virtual leaf surface models have been generated. Individual leaf models have then been subjected to virtual spray droplets and predictions made of droplet interception with the virtual plant leaf. Thereafter, the impaction behaviour of the droplets and the subsequent behaviour of any daughter droplets, up until re-capture, are simulated to give the predicted total spray retention by the leaf. A series of critical thresholds for the stick, bounce, and shatter elements in the impaction process has been developed for different combinations of formulation, droplet size and velocity, and leaf surface characteristics to provide this output. The results show that droplet properties, spray formulations and leaf surface characteristics all influence the predicted amount of spray retained on a horizontal leaf surface. Overall, the predicted spray retention increases as formulation surface tension, static contact angle, droplet size and velocity decrease. Predicted retention on cotton is much higher than on chenopodium.
The average predicted retention on a single horizontal leaf, across all droplet size, velocity and formulation scenarios tested, is 18, 30 and 85% for chenopodium, wheat and cotton, respectively.
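The stick/bounce/shatter decision can be illustrated with the Weber number, a standard dimensionless group for droplet impact (We = ρv²d/σ). The cut-off values below are purely illustrative assumptions, not the thresholds developed in the study, which also depend on formulation and leaf surface characteristics.

```python
def weber(rho, v, d, sigma):
    """Weber number: inertial energy of impact relative to surface tension.
    rho in kg/m^3, v in m/s, d in m, sigma in N/m."""
    return rho * v ** 2 * d / sigma

def impact_outcome(rho, v, d, sigma, we_bounce=20.0, we_shatter=80.0):
    """Classify the impact regime; the two thresholds are hypothetical."""
    we = weber(rho, v, d, sigma)
    if we < we_bounce:
        return "stick"
    if we < we_shatter:
        return "bounce"
    return "shatter"

# A 200-micron water-like droplet (sigma ~ 0.072 N/m) at two impact speeds:
slow = impact_outcome(1000.0, 1.0, 200e-6, 0.072)
fast = impact_outcome(1000.0, 8.0, 200e-6, 0.072)
```

Because the Weber number grows with the square of velocity, the slow droplet sticks while the fast one shatters, matching the trend reported above that retention falls as droplet velocity rises.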
Abstract:
Companies require new strategies to drive growth and survival, as the fast pace of change has created the need for greater business flexibility. Therefore, industry leaders are looking to business innovation as a principal source of differentiation and competitive advantage. However, most companies rely heavily on either technology or products to provide business innovation, yet competitors can easily and rapidly surpass these forms of innovation. Business model innovation expands beyond innovation in isolated areas, such as product innovation, to create strategies that incorporate many business avenues working together to create and deliver value to customers. Existing literature highlights that a business model’s central role is ‘customer value’. However, the emotional underpinnings of customer value within a business model are not well understood. The integration of customer emotion into business model design and the value chain can be viewed as a way to innovate beyond just products, services and processes. This paper investigates the emotional avenues within business strategy and operations, business model innovation and customer engagement. Three propositions are outlined and explored as future research. The significance of this research is to provide companies with a new approach to innovation through a deeper understanding and integration of their customers’ emotions.
Abstract:
Recent road safety statistics show that the decades-long decreasing trend in fatalities is stopping and stagnating. Statistics further show that crashes are mostly driven by human error, compared to other factors such as environmental conditions and mechanical defects. Within human error, the dominant error source is perceptive error, which represents about 50% of the total; the next two sources, interpretation and evaluation, together with perception account for more than 75% of human-error-related crashes. These statistics show that allowing drivers to perceive and understand their environment better, or supplementing them when they are clearly at fault, is a way to assess road risk properly and, as a consequence, further decrease fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of their environment. However, because of inherent limitations in range and field of view, these systems' perception of their environment remains largely limited to a small interest zone around a single vehicle. Such limitations can be overcome by increasing the interest zone through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim at compensating for local systems' limitations by associating embedded information technology and intervehicular communication technology (IVC). With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended, or augmented, perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits by fusing information not only from multiple in-vehicle sensors but also from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map.
It is a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims at demonstrating that augmented perception performs better than non-cooperative approaches, and that it can be used to successfully identify road risk. We found it necessary to evaluate the performance of augmented perception, in order to better understand its limitations. Indeed, while many promising results have already been obtained, the feasibility of building an augmented map from exchanged local perception information, and then using this information beneficially for road users, has not been thoroughly assessed yet; neither have the limitations of augmented perception and its underlying technologies. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver appropriate quality of service to support life-saving critical systems. This is especially true as the road environment is a complex, highly variable setting where many sources of imperfections and errors exist, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting IVC limitations have been underestimated. Then, we develop a new CS-applications simulation architecture. This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL), and then to support further study of augmented perception. At first, we confirm earlier results in terms of the decrease in crash numbers, but raise doubts about benefits in terms of crash severity. In the next step, we implement an augmented perception architecture tasked with creating an augmented map.
Our approach aims at providing a generalist architecture that can use many different types of sensors to create the map, and that is not limited to any specific application. The data association problem is tackled with an MHT approach based on Belief Theory. Then, augmented and single-vehicle perception are compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier; we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this might be valid only for our specific scenario. Finally, we propose a new approach using augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events, and then compared to a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
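The near-miss surrogate can be sketched with a time-to-collision (TTC) test. The 2 s threshold, the sample data, and the per-pair gap inputs below are assumptions for illustration; the thesis's detector builds on the full augmented map rather than isolated gap measurements.

```python
TTC_THRESHOLD = 2.0  # seconds; an assumed surrogate-safety cut-off

def time_to_collision(gap_m, follower_speed, leader_speed):
    """TTC between a following and a leading vehicle (speeds in m/s).
    Infinite when the gap is opening, i.e. no collision course."""
    closing = follower_speed - leader_speed
    return gap_m / closing if closing > 0 else float("inf")

def detect_near_misses(samples, threshold=TTC_THRESHOLD):
    """samples: (timestamp, gap_m, follower_speed, leader_speed) tuples,
    e.g. fused from an augmented map rather than one vehicle's sensors.
    Returns the timestamps flagged as near-miss events."""
    return [t for t, gap, vf, vl in samples
            if time_to_collision(gap, vf, vl) < threshold]

samples = [
    (0.0, 40.0, 25.0, 24.0),   # TTC 40 s: safe
    (1.0, 12.0, 25.0, 18.0),   # TTC ~1.7 s: near-miss
    (2.0, 15.0, 20.0, 22.0),   # opening gap: no collision course
]
events = detect_near_misses(samples)
```

A cooperative deployment improves on this simply by supplying more (and more distant) vehicle pairs to the same test than a single vehicle's sensors could observe.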
Abstract:
INTRODUCTION There is evidence that the reduction of blood perfusion caused by closed soft tissue trauma (CSTT) delays the healing of the affected soft tissues and bone [1]. We hypothesise that the characterisation of vascular morphology changes (VMC) following injury allows us to determine the effect of the injury on tissue perfusion and thereby the severity of the injury. This research therefore aims to assess the VMC following CSTT in a rat model using contrast-enhanced micro-CT imaging. METHODOLOGY A reproducible CSTT was created on the left leg of anaesthetised rats (male, 12 weeks) with an impact device. After euthanising the animals at 6 and 24 hours following trauma, the vasculature was perfused with a contrast agent (Microfil, Flowtech, USA). Both hindlimbs were dissected and imaged using micro-CT for qualitative comparison of the vascular morphology and quantification of the total vascular volume (VV). In addition, biopsy samples were taken from the CSTT region and scanned to compare morphological parameters of the vasculature between the injured and control limbs. RESULTS AND DISCUSSION While visual observation of the hindlimb scans showed consistent perfusion of the microvasculature with Microfil, enabling the identification of all major blood vessels, no clear differences in the vascular architecture were observed between injured and control limbs. However, overall VV within the region of interest (ROI) was measured to be higher for the injured limbs after 24 h. Also, scans of biopsy samples demonstrated that vessel diameter and density were higher in the injured legs 24 h after impact. CONCLUSION We believe these results will contribute to the development of objective diagnostic methods for CSTT based on changes to the microvascular morphology, as well as aiding the validation of future non-invasive clinical assessment modalities.
Abstract:
This paper investigates how Enterprise Architecture (EA) evolves due to emerging trends. It specifically explores how EA integrates Service-oriented Architecture (SOA). Archer’s Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaborations) that then takes place. The paper focuses on the reasons why EA evolution may or may not take place, and on the architectural changes that can happen due to SOA integration. The research builds on sound theoretical foundations to discuss EA evolution in a field that often lacks a solid theoretical groundwork. Specifically, it proposes that critical realism, using morphogenetic theory, can provide a useful theoretical foundation for studying EA evolution. The initial results of a literature review (a-priori model) were extended using explorative interviews. The findings of this study are threefold. First, there are five different levels of EA-SOA integration outcomes. Second, a mature EA, a flexible and well-defined EA framework, and comprehensive EA objectives improve the integration outcomes. Third, the analytical separation using Archer’s theory is helpful in understanding how these different integration outcomes are generated.
Abstract:
Letting the patron choose ebooks has been a successful experience. Why not apply the same purchase model to other formats? This showcase outlines Queensland University of Technology’s experience with a trial of patron driven acquisition (PDA) for online video. The trial, commencing in August 2012, provided access to over 700 online videos licensed from Kanopy across a number of discipline areas. As online video publishing is still in the early stages of development, and as the trial is only in its very early stages, it is too early to draw any firm conclusions about the likely suitability of this model for online video selection and acquisition. However, the trial provides some interesting initial comparisons with ebook PDA and existing online video purchase models, and prompts further consideration of PDA as a method for online video selection and licensing.
Abstract:
Tissue engineering focuses on the repair and regeneration of tissues through the use of biodegradable scaffold systems that structurally support regions of injury whilst recruiting and/or stimulating cell populations to rebuild the target tissue. Within bone tissue engineering, the effects of scaffold architecture on cellular response have not been conclusively characterized in a controlled-density environment. We present a theoretical and practical assessment of the effects of polycaprolactone (PCL) scaffold architectural modifications on mechanical and flow characteristics, as well as on MC3T3-E1 preosteoblast cellular response, in an in vitro static plate and a custom-designed perfusion bioreactor model. Four scaffold architectures were contrasted, which varied in inter-layer lay-down angle and offset between layers, whilst maintaining a structural porosity of 60 ± 5%. We established that as layer angle was decreased (90° vs. 60°) and offset was introduced (0 vs. 0.5 between layers), structural stiffness, yield stress, strength, pore size and permeability decreased, whilst computational fluid dynamics-modeled wall shear stress increased. The most significant effects were noted with layer offset. Seeding efficiencies in static culture were also dramatically increased by offset (~45% to ~86%), with static culture exhibiting a much higher seeding efficiency than perfusion culture. Scaffold architecture had minimal effect on cell response in static culture. However, architecture influenced osteogenic differentiation in perfusion culture, likely by modifying the microfluidic environment.
Abstract:
An Artificial Neural Network (ANN) is a computational modeling tool which has found extensive acceptance in many disciplines for modeling complex real-world problems. An ANN can model problems through learning by example, rather than by fully understanding the detailed characteristics and physics of the system. In the present study, the accuracy and predictive power of an ANN were evaluated in predicting the kinematic viscosity of biodiesels over the wide range of temperatures typically encountered in diesel engine operation. In this model, temperature and chemical composition of biodiesel were used as input variables. In order to obtain the necessary data for model development, the chemical composition and temperature-dependent fuel properties of ten different types of biodiesel were measured experimentally using laboratory-standard testing equipment following internationally recognized testing procedures. The Neural Networks Toolbox of MATLAB R2012a was used to train, validate and simulate the ANN model on a personal computer. The network architecture was optimised by trial and error to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was determined by calculating the absolute fraction of variance (R²), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found that the ANN is highly accurate in predicting the viscosity of biodiesel and demonstrates the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties at different temperature levels. Therefore the model developed in this study can be a useful tool for accurately predicting biodiesel fuel properties, instead of undertaking costly and time-consuming experimental tests.
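The modelling approach can be sketched with a small feedforward network. The study used MATLAB's Neural Networks Toolbox on measured biodiesel data; the synthetic temperature/viscosity data, the network size and the training settings below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: viscosity falling roughly exponentially with
# temperature, plus measurement noise (the real inputs also include
# chemical composition).
T = rng.uniform(10, 90, size=(200, 1))                 # temperature, deg C
y = 6.0 * np.exp(-0.02 * T) + rng.normal(0.0, 0.05, (200, 1))

Tn = (T - T.mean()) / T.std()                          # normalise inputs

# One hidden layer of 8 tanh units, trained by plain batch gradient descent.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(Tn @ W1 + b1)                          # forward pass
    err = (h @ W2 + b2) - y
    dW2 = h.T @ err / len(y); db2 = err.mean(0)        # backprop, MSE loss
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = Tn.T @ dh / len(y); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = np.tanh(Tn @ W1 + b1) @ W2 + b2
r2 = 1 - np.sum((pred - y) ** 2) / np.sum((y - y.mean()) ** 2)
```

`r2` here plays the role of the study's fraction-of-variance metric: a value close to 1 indicates the network has captured the temperature/viscosity relationship.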
Abstract:
In this study, the mixed convection heat transfer and fluid flow behaviors in a lid-driven square cavity filled with a high Prandtl number fluid (Pr = 5400, ν = 1.2×10⁻⁴ m²/s, where ν is the viscosity of the fluid) at low Reynolds number are studied using the thermal Lattice Boltzmann method (TLBM). The LBM is built on the D2Q9 model and the single relaxation time method called the Lattice-BGK (Bhatnagar-Gross-Krook) model. The effects of variations of the non-dimensional mixed convection parameter, the Richardson number (Ri), with and without a heat-generating source, on the thermal and flow behavior of the fluid inside the cavity are investigated. The results are presented as velocity and temperature profiles as well as stream function and temperature contours for Ri ranging from 0.1 to 5.0, with the other controlling parameters present in this study. It is found that the LBM has good potential to simulate mixed convection heat transfer and fluid flow problems. Finally, the simulation results have been compared with previous numerical and experimental results and are found to be in good agreement.
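A stripped-down illustration of a D2Q9 BGK update (equilibrium, collision, periodic streaming) follows. The lattice size, relaxation time and initial shear flow are arbitrary choices, and the thermal coupling and lid-driven cavity boundary conditions used in the study are omitted.

```python
import numpy as np

# D2Q9 lattice: rest + 4 axis + 4 diagonal velocities, standard weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
tau = 0.8                                    # single relaxation time (BGK)

def equilibrium(rho, u):
    """Equilibrium distribution, truncated to second order in velocity."""
    cu = np.einsum("qd,xyd->xyq", c, u)
    usq = np.einsum("xyd,xyd->xy", u, u)[..., None]
    return w * rho[..., None] * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

nx = ny = 16
rho = np.ones((nx, ny))
u = np.zeros((nx, ny, 2))
u[..., 0] = 0.05 * np.sin(2 * np.pi * np.arange(ny) / ny)  # small shear flow
f = equilibrium(rho, u)

for _ in range(10):
    rho = f.sum(axis=-1)                                   # density
    u = np.einsum("xyq,qd->xyd", f, c) / rho[..., None]    # velocity
    f += -(f - equilibrium(rho, u)) / tau                  # BGK collision
    for q in range(9):                                     # periodic streaming
        f[..., q] = np.roll(f[..., q], shift=tuple(c[q]), axis=(0, 1))
```

Because the collision step relaxes toward an equilibrium with the same density, and streaming only moves populations between sites, total mass is conserved exactly.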
Abstract:
Process models provide companies with efficient means for managing their business processes. Tasks in which process models are employed differ by nature and require models of various abstraction levels. However, maintaining several models of one business process involves a lot of synchronization effort and is error-prone. Business process model abstraction assumes a detailed model of a process to be available and derives coarse-grained models from it. The task of abstraction is to tell significant model elements from insignificant ones and to reduce the latter. In this paper we argue that process model abstraction can be driven by different abstraction criteria. The choice of criterion depends on the task which abstraction facilitates. We propose an abstraction slider - a mechanism that allows user control of the model abstraction level. We discuss examples of combining the slider with different abstraction criteria and sets of process model transformation rules.
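The slider mechanism can be sketched as a significance filter over model elements. The activities, their scores and the threshold semantics below are invented for illustration; the paper combines the slider with different abstraction criteria and transformation rules rather than a single score.

```python
# Hypothetical process model: activities with a significance score
# (e.g. execution frequency); the slider sets the abstraction level.
model = [
    ("Receive order", 0.95),
    ("Check stock", 0.80),
    ("Log debug info", 0.05),
    ("Ship goods", 0.90),
    ("Send reminder", 0.20),
]

def abstract_model(activities, slider):
    """Keep activities whose significance is at least the slider value.

    slider = 0.0 returns the full, detailed model; raising the slider
    reduces insignificant elements, yielding a coarser-grained view."""
    return [name for name, sig in activities if sig >= slider]

overview = abstract_model(model, 0.5)    # coarse-grained view
detailed = abstract_model(model, 0.0)    # full model
```

Moving the slider between 0.0 and 1.0 trades detail for overview, which is the user-controlled abstraction level the paper proposes.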