227 results for FITTING IDEALS
Abstract:
Anatomically pre-contoured fracture fixation plates are a treatment option for bone fractures. A well-fitting plate can be used as a tool for anatomical reduction of the fractured bone. However, recent studies have shown that some plates fit poorly for many patients due to considerable shape variations between bones of the same anatomical site. The plates therefore have to be manually fitted and deformed by surgeons to fit each patient optimally. The process is time- and labor-intensive, and could lead to adverse clinical implications such as wound infection or plate failure. This paper proposes a new iterative method to simulate the patient-specific deformation of an optimally fitting plate for pre-operative planning purposes. We further validate the method through a case study. The proposed method integrates four commercially available software tools, Matlab, Rapidform2006, SolidWorks, and ANSYS, each performing specific tasks to obtain a plate shape that fits an individual tibia optimally and is mechanically safe. A typical challenge when crossing multiple platforms is ensuring correct data transfer. We present an example implementation of the proposed method to demonstrate successful data transfer between the four platforms and the feasibility of the method.
Abstract:
A mathematical model is developed for the ripening of cheese. Such models may assist in predicting final cheese quality from measured initial composition. The main constituent chemical reactions are described with ordinary differential equations, and numerical solutions to the model equations are found using Matlab. Unknown parameter values have been fitted using experimental data available in the literature, and the results from the numerical fitting are in good agreement with the data. Statistical analysis is performed on near-infrared data provided to the MISG; however, due to the inhomogeneity and limited nature of the data, few conclusions can be drawn from the analysis. A simple model of the potential changes in acidity of cheese is also considered. The results from this model are consistent with cheese manufacturing knowledge, in that the pH of cheddar cheese does not change significantly during ripening.
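The workflow this abstract describes (reaction ODEs, numerical solution, parameter fitting against published data) can be sketched in Python rather than Matlab. This is a minimal illustration only: the single first-order substrate-to-product reaction and its rate constant are invented stand-ins, not the paper's actual chemistry, and the "data" here are synthetic.

```python
# Sketch: integrate a toy first-order ripening reaction and fit its rate
# constant to measured product concentrations (synthetic here).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def ripening_ode(t, y, k):
    # dS/dt = -k*S, dP/dt = k*S : substrate converted to product
    S, P = y
    return [-k * S, k * S]

def simulate(k, t_eval, y0=(1.0, 0.0)):
    sol = solve_ivp(ripening_ode, (t_eval[0], t_eval[-1]), y0,
                    t_eval=t_eval, args=(k,))
    return sol.y[1]  # product concentration over time

t_data = np.linspace(0.0, 10.0, 20)
p_data = simulate(0.3, t_data)  # stand-in for literature measurements

# fit the unknown rate constant by nonlinear least squares
fit = least_squares(lambda k: simulate(k[0], t_data) - p_data, x0=[0.1])
k_hat = fit.x[0]
```

The same pattern extends to several coupled reactions: stack the rate equations in `ripening_ode` and pass all unknown rate constants through `args`.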
Abstract:
This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself in the form of infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the direct-follows relations between event labels. This automaton is then pruned of arcs with low relative frequency and used to remove from the log those events that do not fit the automaton, which are identified as outliers. The technique has been extensively evaluated on top of various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model in terms of fitness, appropriateness and simplicity, without negative effects on generalization. Further, the technique scales well to large and complex logs.
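The core idea (count direct-follows relations, prune infrequent arcs, drop non-fitting events) can be sketched as below. This is an illustration of the frequency-pruning concept only, not the authors' implementation; the relative-frequency rule (arc count vs. the most frequent arc leaving the same source) and all function names are assumptions.

```python
# Sketch: frequency-based noise filtering on an event log.
from collections import Counter

def direct_follows(log):
    # count each (a, b) pair where b directly follows a in some trace
    counts = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

def prune(counts, threshold):
    # keep an arc only if its frequency is at least `threshold` times the
    # most frequent arc leaving the same source label
    max_out = Counter()
    for (a, _), c in counts.items():
        max_out[a] = max(max_out[a], c)
    return {arc for arc, c in counts.items() if c >= threshold * max_out[arc[0]]}

def filter_log(log, kept_arcs):
    # drop events whose incoming direct-follows arc was pruned (outliers)
    filtered = []
    for trace in log:
        clean = [trace[0]]
        for ev in trace[1:]:
            if (clean[-1], ev) in kept_arcs:
                clean.append(ev)
        filtered.append(clean)
    return filtered

log = [list("abcd")] * 20 + [list("abxcd")]  # 'x' is a rare, noisy event
kept = prune(direct_follows(log), threshold=0.5)
clean_log = filter_log(log, kept)
```

On this toy log the single `x` event is removed while the twenty frequent traces pass through unchanged.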
Abstract:
Acid hydrolysis is a popular pretreatment for removing hemicellulose from lignocelluloses in order to produce a digestible substrate for enzymatic saccharification. In this work, a novel model for the dilute acid hydrolysis of hemicellulose within sugarcane bagasse is presented and calibrated against experimental oligomer profiles. The efficacy of mathematical models as hydrolysis yield predictors and as vehicles for investigating the mechanisms of acid hydrolysis is also examined. Experimental xylose, oligomer (degree of polymerisation 2 to 6) and furfural yield profiles were obtained for bagasse under dilute acid hydrolysis conditions at temperatures ranging from 110 °C to 170 °C. Population balance kinetics, diffusion and porosity evolution were incorporated into a mathematical model of the acid hydrolysis of sugarcane bagasse. This model was able to produce a good fit to experimental xylose yield data with only three unknown kinetic parameters, ka, kb and kd. However, fitting this same model to an expanded data set of oligomeric and furfural yield profiles did not successfully reproduce the experimental results. It was found that a "hard-to-hydrolyse" parameter, α, was required in the model to reproduce the experimental oligomer profiles at 110 °C, 125 °C and 140 °C. The parameters obtained through the fitting exercises at lower temperatures could then be used to predict the oligomer profiles at 155 °C and 170 °C, with promising results. The interpretation of kinetic parameters obtained by fitting a model to only a single set of data may be ambiguous: although these parameters may correctly reproduce the data, they may not be indicative of the actual rate parameters unless some care has been taken to ensure that the model describes the true mechanisms of acid hydrolysis. It is possible to challenge the robustness of the model by expanding the experimental data set and hence limiting the parameter space for the fitting parameters.
The novel combination of "hard-to-hydrolyse" and population balance dynamics in the model presented here appears to stand up to such rigorous fitting constraints.
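The effect of a "hard-to-hydrolyse" fraction can be illustrated with a toy kinetic system in which a fraction α of the xylan reacts at a much slower rate, releasing xylose that in turn degrades to furfural. This sketch deliberately omits the paper's population-balance, diffusion and porosity terms, and all rate constants below are invented for illustration.

```python
# Sketch: two-fraction ("hard-to-hydrolyse") xylan kinetics with downstream
# furfural formation. Structure and rate constants are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

def hydrolysis(t, y, ka, kb, kd):
    easy, hard, xylose, furfural = y
    return [-ka * easy,                           # fast-reacting xylan
            -kb * hard,                           # recalcitrant xylan
            ka * easy + kb * hard - kd * xylose,  # xylose release and loss
            kd * xylose]                          # furfural formation

alpha = 0.3                        # hard-to-hydrolyse fraction
y0 = [1.0 - alpha, alpha, 0.0, 0.0]
sol = solve_ivp(hydrolysis, (0.0, 50.0), y0, args=(0.5, 0.02, 0.05),
                t_eval=np.linspace(0.0, 50.0, 200))

# mass balance check: the four pools should always sum to the initial xylan
total = sol.y.sum(axis=0)
```

Because the slow pool drains at rate kb ≪ ka, the xylose yield curve flattens early and then creeps upward, which is the qualitative signature the "hard-to-hydrolyse" parameter is introduced to capture.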
Abstract:
The monitoring of the actual activities of daily living of individuals with lower limb amputation is essential for an evidence-based fitting of the prosthesis, particularly the choice of components (e.g., knees, ankles, feet) [1-4]. The purpose of this presentation was to give an overview of the categorization of load regime data used to assess the functional output and usage of the prosthesis of lower limb amputees, as presented in several publications [5, 6]. The objectives were to present a categorization of load regimes and to report the results for a case.
Abstract:
The understanding of the load applied on the residuum through the prosthesis of individuals with transfemoral amputation (TFA) is essential to address a number of concerns that could strongly reduce their quality of life (e.g., residuum skin lesions, prosthesis fitting, alignment). This inner-prosthesis loading could be estimated in a typical gait laboratory using inverse dynamics equations. Alternatively, technological advances proposed over the last decade have enabled direct measurement of this kinetic information in a broad variety of situations that could potentially be more relevant in clinical settings. The purposes of this presentation are (A) to review the literature on recent developments in the measurement and analysis of inner-prosthesis loading of TFA, and (B) to extract information that could potentially contribute to better evidence-based practice.
Abstract:
Theatre is a socially and politically aware artform. It participates in the construction – and, potentially, the contestation – of a community’s history, identity, and ideals. It does this live, in the moment, where artist, artwork and audience meet here, now, together. This, most theatre makers think, gives theatre special power to make spectators think about the stories it stages. But it also creates challenges.
Abstract:
Pilot and industrial scale dilute acid pretreatment data can be difficult to obtain due to the significant infrastructure investment required. Consequently, models of dilute acid pretreatment by necessity use laboratory scale data to determine kinetic parameters and make predictions about optimal pretreatment conditions at larger scales. For these recommendations to be meaningful, the ability of laboratory scale models to predict pilot and industrial scale yields must be investigated. A mathematical model of the dilute acid pretreatment of sugarcane bagasse has previously been developed by the authors. This model was able to successfully reproduce the experimental yields of xylose and short chain xylooligomers obtained at the laboratory scale. In this paper, the ability of the model to reproduce pilot scale yield and composition data is examined. It was found that, in general, the model over-predicted the pilot scale reactor yields by a significant margin. Models that appear very promising at the laboratory scale may have limitations when predicting yields at a pilot or industrial scale. It is difficult to comment on whether there are any consistent trends in optimal operating conditions between reactor scale and laboratory scale hydrolysis due to the limited reactor datasets available. Further investigation is needed to determine whether the model has some efficacy when the kinetic parameters are re-evaluated by fitting to reactor scale data; however, this requires the compilation of larger datasets. Alternatively, laboratory scale mathematical models may have enhanced utility for predicting larger scale reactor performance if bulk mass transport and fluid flow considerations are incorporated into the fibre scale equations. This work reinforces the need for appropriate attention to be paid to pilot scale experimental development when moving from laboratory to pilot and industrial scales for new technologies.
Abstract:
The DC9 workshop takes place on June 27, 2015 in Limerick, Ireland and is titled “Hackable Cities: From Subversive City Making to Systemic Change”. The notion of “hacking” originates from the world of media technologies but is increasingly being used for creative ideals and practices of city making. “City hacking” evokes more participatory, inclusive, decentralized, playful and subversive alternatives to the often top-down ICT implementations of smart city making. However, these discourses about “hacking the city” are used ambiguously and are loaded with various ideological presumptions, which also makes the term problematic. For some, “urban hacking” is about empowering citizens to organize around communal issues and perform aesthetic urban interventions. For others it raises questions about governance: what kind of “city hacks” should be encouraged or not, and who decides? Can city hacking be curated? For yet others, trendy participatory buzzwords like these are masquerades for deeply libertarian neoliberal values. A further question is how “city hacking” may mature from the tactical level of smart and often playful interventions to the strategic level of enduring impact. The Digital Cities 9 workshop welcomes papers that explore the idea of “hackable city making” in constructive and critical ways.
Abstract:
The future of civic engagement is characterised both by technological innovation and by new technological user practices, fuelled by trends towards mobile, personal devices; broadband connectivity; open data; urban interfaces; and cloud computing. These technology trends are progressing at a rapid pace, and have led global technology vendors to package and sell the ‘Smart City’ as a centralized service delivery platform predicted to optimize and enhance cities’ key performance indicators – and generate a profitable market. The top-down deployment of these large and proprietary technology platforms has helped sectors such as energy, transport, and healthcare to increase efficiencies. However, an increasing number of scholars and commentators warn of another ‘IT bubble’ emerging. Along with some city leaders, they argue that the top-down approach does not fit the governance dynamics and values of a liberal democracy when applied across sectors. A thorough understanding is required of the socio-cultural nuances of how people work, live and play across different environments, and how they employ social media and mobile devices to interact with, engage in, and constitute public realms. Although the term ‘slacktivism’ is sometimes used to denote a watered-down version of civic engagement and activism, reduced to clicking a ‘Like’ button and signing online petitions, we believe that we are far from witnessing another Biedermeier period of people focusing on the domestic and the non-political. There is plenty of evidence to the contrary, such as the post-election violence in Kenya in 2008, the Occupy movements in New York, Hong Kong and elsewhere, the Arab Spring, Stuttgart 21, Fukushima, the Taksim Gezi Park protests in Istanbul, and the Vinegar Movement in Brazil in 2013. These examples of civic action shape the dynamics of governments and, in turn, call for new processes to be incorporated into governance structures.
Participatory research into these new processes across the triad of people, place and technology is a significant and timely investment to foster productive, sustainable, and livable human habitats. With this chapter, we want to reframe the current debates in academia and priorities in industry and government to allow citizens and civic actors to take their rightful place at the center of civic movements. This calls for new participatory approaches for co-inquiry and co-design. It is an evolving process with an explicit agenda to facilitate change, and we propose participatory action research (PAR) as an indispensable component in the journey to develop new governance infrastructures and practices for civic engagement. This chapter proposes participatory action research as a useful and fitting research paradigm to guide methodological considerations surrounding the study, design, development, and evaluation of civic technologies. We do not limit our definition of civic technologies to tools specifically designed to simply enhance government and governance, such as renewing your car registration online or casting your vote electronically on election day. Rather, we are interested in civic media and technologies that foster citizen engagement in the widest sense, and particularly the participatory design of such civic technologies that strive to involve citizens in political debate and action as well as question conventional approaches to political issues (DiSalvo, 2012; Dourish, 2010; Foth et al., 2013). Following an outline of some underlying principles and assumptions behind participatory action research, especially as it applies to cities, we will critically review case studies to illustrate the application of this approach with a view to engendering robust, inclusive, and dynamic societies built on the principles of engaged liberal democracy. The rationale for this approach is an alternative to smart cities in a ‘perpetual tomorrow’ (cf. e.g. Dourish & Bell, 2011), based on the many weak and strong signals of civic action revolving around technology seen today. It seeks to emphasize and direct attention to active citizenry over passive consumerism, human actors over human factors, culture over infrastructure, and prosperity over efficiency. First, we will look at some fundamental issues arising from applying simplistic smart city visions to the kind of problem a city is (cf. Jacobs, 1961). We focus on the touch points between “the city” and its civic body, the citizens. In order to provide for meaningful civic engagement, the city must provide appropriate interfaces.
Abstract:
The American Association of Australasian Literary Studies (AAALS) Annual Conference, Fort Worth, Texas, 9–11 April 2015. “The dark fluidity of Melbourne suburbia in Sonya Hartnett’s Butterfly”. Sonya Hartnett’s Butterfly (2009) is a fictional account of the suburban family life of the Coyles in 1980s outer-suburban Melbourne, written from the perspective of teenager Plum Coyle. The Coyle family at first glance appear to be living a textbook example of the suburban lifestyle developed from the 19th century and sustained well into the twentieth, in which housing design and gender roles were clearly defined and “connected with a normative heterosexuality” (Johnson 2000: 94). The Australian suburban space is also well documented as a place where people often have to contend with oppressive, rigid social and cultural ideals (e.g. Rowse 1978, Johnson 1993, Turnbull 2008, and Flew 2011). There is a tendency to treat the “suburb” as one monolithic space, but this paper will argue that Hartnett exposes the dark fluidity and complexity of the term, just as she reveals that, despite or perhaps because of the planned nature of suburbia, the lives that people live are often just as complex.
Size-resolved particle distribution and gaseous concentrations by real-world road tunnel measurement
Abstract:
Measurements of aerosol particle number size distributions (15–700 nm), CO and NOx were performed in a bus tunnel in Australia. Daily mean particle size distributions of a mixed diesel/CNG (Compressed Natural Gas) bus traffic flow were determined over 4 consecutive measurement days. Emission factors (EFs) of the particle size distributions of diesel and CNG buses were obtained by multiple linear regression (MLR); the particle distributions of diesel and CNG buses appeared as a single accumulation mode and a single nuclei mode, respectively. The particle size distributions of the mixed traffic flow were decomposed into two log-normal fitting curves for each 30-minute mean scan: the entire mixed-fleet PSD emission could be well fitted by the sum of two log-normal distribution curves, composed of a nuclei-mode curve and an accumulation-mode curve, identified as the CNG bus and diesel bus particle number emission curves, respectively. Finally, the particle size distributions of diesel and CNG buses were summarized with box-and-whisker plots. For the log-normal particle size distribution of diesel buses, accumulation-mode diameters were 74.5–87.5 nm and geometric standard deviations were 1.89–1.98. For the log-normal particle size distribution of CNG buses, nuclei-mode diameters were 21–24 nm and geometric standard deviations were 1.27–1.31.
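The two-mode log-normal decomposition described above can be sketched as a nonlinear least-squares fit of a bimodal model to a measured dN/dlogDp spectrum. The channel grid and all parameter values below are illustrative (chosen near the reported diameter and σg ranges), and the data here are synthetic.

```python
# Sketch: decompose a mixed-fleet particle size distribution into a
# nucleation mode and an accumulation mode (sum of two log-normals).
import numpy as np
from scipy.optimize import curve_fit

def lognormal_mode(dp, n_tot, dp_g, sigma_g):
    # dN/dlogDp for one log-normal mode (dp and dp_g in nm)
    return (n_tot / (np.sqrt(2 * np.pi) * np.log10(sigma_g))
            * np.exp(-(np.log10(dp) - np.log10(dp_g)) ** 2
                     / (2 * np.log10(sigma_g) ** 2)))

def two_mode(dp, n1, d1, s1, n2, d2, s2):
    return lognormal_mode(dp, n1, d1, s1) + lognormal_mode(dp, n2, d2, s2)

dp = np.logspace(np.log10(15), np.log10(700), 60)   # 15-700 nm channels
true = (1e4, 22.0, 1.3, 5e3, 80.0, 1.9)             # nuclei + accumulation
psd = two_mode(dp, *true)                           # synthetic mixed-fleet scan

p0 = (8e3, 25.0, 1.4, 4e3, 70.0, 1.8)               # rough initial guess
popt, _ = curve_fit(two_mode, dp, psd, p0=p0)
```

After fitting, `popt[1]` recovers the nuclei-mode geometric mean diameter (attributed to CNG buses in the abstract) and `popt[4]` the accumulation-mode diameter (diesel buses).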
Abstract:
This project constructed virtual plant leaf surfaces from digitised data sets for use in droplet spray models. Digitisation techniques for obtaining data sets for cotton, Chenopodium and wheat leaves are discussed, and novel algorithms for the reconstruction of the leaves from these three plant species are developed. The reconstructed leaf surfaces are included in agricultural droplet spray models to investigate the effect of the nozzle and spray formulation combination on the proportion of spray retained by the plant. A numerical study of the post-impaction motion of large droplets that have formed on the leaf surface is also considered.
Abstract:
This thesis introduces a new way of using prior information in a spatial model and develops scalable algorithms for fitting this model to large imaging datasets. These methods are applied to image-guided radiation therapy and satellite-based classification of land use and water quality. The study utilizes a pre-computation step to achieve a hundredfold improvement in the elapsed runtime for model fitting. This makes it much more feasible to apply these models to real-world problems, and enables full Bayesian inference for images with a million or more pixels.
Abstract:
Purpose: To evaluate the influence of cone location and corneal cylinder on RGP-corrected visual acuities and residual astigmatism in patients with keratoconus. Methods: In this prospective study, 156 eyes from 134 patients were enrolled. A complete ophthalmologic examination, including manifest refraction, best spectacle-corrected visual acuity (BSCVA) and slit-lamp biomicroscopy, was performed, and corneal topography analysis was done. According to the cone location on the topographic map, the patients were divided into central and paracentral cone groups. Trial RGP lenses were selected based on the flat Sim K readings and a ‘three-point touch’ fitting approach was used. Over-contact-lens refraction was performed, residual astigmatism (RA) was measured and best-corrected RGP visual acuities (RGPVA) were recorded. Results: The mean age (±SD) was 22.1 ± 5.3 years. 76 eyes (48.6%) had a central cone and 80 eyes (51.4%) a paracentral cone. Prior to RGP lens fitting, mean (±SD) subjective refraction spherical equivalent (SRSE), subjective refraction astigmatism (SRAST) and BSCVA (logMAR) were −5.04 ± 2.27 D, −3.51 ± 1.68 D and 0.34 ± 0.14, respectively. There were statistically significant differences between the central and paracentral cone groups in mean values of SRSE, SRAST, flat meridian (Sim K1), steep meridian (Sim K2), mean K and corneal cylinder (p-values < 0.05). Comparison of BSCVA to RGPVA showed that vision improved by 0.3 logMAR with RGP lenses (p < 0.0001). Mean (±SD) RA was −0.72 ± 0.39 D. There were no statistically significant differences between the RGPVAs and RAs of the central and paracentral cone groups (p = 0.22 and p = 0.42, respectively). Pearson's correlation analysis showed a statistically significant relationship between corneal cylinder and both BSCVA and RGPVA; however, the relationship between corneal cylinder and residual astigmatism was not significant.
Conclusions: Cone location has no effect on the RGP-corrected visual acuities and residual astigmatism in patients with keratoconus. Corneal cylinder and Sim K values influence RGP-corrected visual acuities but do not influence residual astigmatism.