915 results for Performance-Based Design
Abstract:
OBJECTIVE: Renal resistive index (RRI) varies directly with renal vascular stiffness and pulse pressure. RRI correlates positively with arteriolosclerosis in damaged kidneys and predicts progressive renal dysfunction. Matrix Gla-protein (MGP) is a vascular calcification inhibitor that requires vitamin K for activation. Inactive MGP, known as desphospho-uncarboxylated MGP (dp-ucMGP), can be measured in plasma and has been associated with various cardiovascular (CV) markers, CV outcomes and mortality. In this study we hypothesized that increased RRI is associated with high levels of dp-ucMGP. DESIGN AND METHOD: We recruited participants via a multi-center family-based cross-sectional study in Switzerland exploring the role of genes and kidney hemodynamics in blood pressure regulation. Dp-ucMGP was quantified in plasma samples by sandwich ELISA. Renal Doppler sonography was performed using a standardized protocol to measure RRI on 3 segmental arteries in each kidney. The mean of the 6 measures was reported. Multiple regression analysis was performed to estimate associations between RRI and dp-ucMGP, adjusting for sex, age, pulse pressure, mean pressure, renal function and other CV risk factors. RESULTS: We included 1035 participants in our analyses. Mean values were 0.64 ± 0.06 for RRI and 0.44 ± 0.21 nmol/L for dp-ucMGP. RRI was positively associated with dp-ucMGP both before and after adjustment for sex, age, body mass index, pulse pressure, mean pressure, heart rate, renal function, low- and high-density lipoprotein, smoking status, diabetes, blood pressure- and cholesterol-lowering drugs, and history of CV disease (P < 0.001). CONCLUSIONS: RRI is independently and positively associated with high levels of dp-ucMGP after adjustment for pulse pressure and common CV risk factors. Further studies are needed to determine whether vitamin K supplementation can have a positive effect on renal vascular stiffness and kidney function.
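The covariate-adjusted association described above can be sketched as an ordinary least-squares regression. The sketch below uses synthetic data; the effect sizes, covariate distributions and noise level are illustrative assumptions, not the study's data or model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1035  # sample size reported in the abstract

# Synthetic covariates (distributions are illustrative)
age = rng.normal(50, 10, n)
pulse_pressure = rng.normal(45, 8, n)
dp_ucmgp = rng.normal(0.44, 0.21, n)

# Simulated RRI with a positive dp-ucMGP effect plus covariate effects
rri = (0.64 + 0.05 * dp_ucmgp + 0.001 * age
       + 0.0005 * pulse_pressure + rng.normal(0, 0.02, n))

# Design matrix: intercept, exposure, then adjustment covariates
X = np.column_stack([np.ones(n), dp_ucmgp, age, pulse_pressure])
beta, *_ = np.linalg.lstsq(X, rri, rcond=None)

# beta[1] is the dp-ucMGP coefficient adjusted for age and pulse
# pressure; a positive value mirrors the reported association
adjusted_effect = beta[1]
```

A real analysis would add all the covariates listed in the abstract to the design matrix in the same way.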
Abstract:
Over the last half decade, the popularity of different peer-to-peer applications has grown tremendously. Traditionally, only desktop-class computers with fixed-line network connections have been powerful enough to utilize peer-to-peer. However, the situation is about to change. The rapid development of wireless terminals will soon enable peer-to-peer applications on these devices as well as on desktops. The possibilities are further enhanced by the upcoming high-bandwidth cellular networks. In this thesis, the applicability and implementation alternatives of an existing peer-to-peer system are studied for two target platforms: a Linux-powered iPaq and a Symbian OS based smartphone. The result is a peer-to-peer middleware component suitable for mobile terminals. It works on both platforms and utilizes Bluetooth networking technology. The implemented software platforms are compatible with each other, and support for additional network technologies can be added with minimal effort.
Abstract:
We present a dual-trap optical tweezers setup which directly measures forces using linear momentum conservation. The setup uses a counter-propagating geometry, which allows momentum measurement on each beam separately. The experimental advantages of this setup include low drift due to all-optical manipulation, and a robust calibration (independent of the features of the trapped object or buffer medium) due to the force measurement method. Although this design does not attain the high resolution of some co-propagating setups, we show that it can be used to perform different single-molecule measurements: fluctuation-based molecular stiffness characterization at different forces and hopping experiments on molecular hairpins. Remarkably, in our setup it is possible to manipulate very short tethers (such as molecular hairpins with short handles) down to the limit where the beads are almost in contact. The setup is used to illustrate a novel method for measuring the stiffness of optical traps and tethers on the basis of equilibrium force fluctuations, i.e., without the need to measure the force vs. molecular extension curve. This method is of general interest for dual-trap optical tweezers setups and can be extended to setups which do not directly measure forces.
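The general idea behind fluctuation-based stiffness estimation can be illustrated with the equipartition theorem: for a harmonic trap, k·⟨δx²⟩ = k_BT, so the stiffness can be recovered from equilibrium fluctuations alone. A minimal sketch on simulated data (all parameter values are illustrative, and this is the textbook position-variance version, not the paper's specific force-fluctuation method):

```python
import numpy as np

k_B_T = 4.11e-21  # thermal energy at ~298 K, in joules
k_true = 5e-5     # trap stiffness in N/m (illustrative)

rng = np.random.default_rng(1)

# At equilibrium, a bead in a harmonic trap satisfies the
# equipartition theorem: k * <dx^2> = k_B*T, so its position
# fluctuations have variance k_B*T / k.
x = rng.normal(0.0, np.sqrt(k_B_T / k_true), 100_000)

# Recover the stiffness from the measured fluctuations alone,
# without pulling a force-extension curve
k_est = k_B_T / np.var(x)
```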
Abstract:
This master’s thesis aims to study and present, based on the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods that imitate the natural evolution process. An artificial evolution process evaluates the fitness of each individual, where the individuals are candidate solutions. The next population of candidate solutions is formed from the good properties of the current population by applying mutation and crossover operations. Different kinds of evolutionary algorithm applications related to software engineering were sought in the literature, then classified and presented. The necessary basics of evolutionary algorithms are also presented. It was concluded that the majority of evolutionary algorithm applications related to software engineering concerned software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling related to parallel computing, allocating modules to subsystems, N-version programming, test data generation and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
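The evaluate-select-recombine-mutate loop described above can be sketched as a minimal genetic algorithm on the classic OneMax toy problem (all parameters below are illustrative, not from any application in the thesis):

```python
import random

random.seed(42)

GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 60

def fitness(genome):
    # OneMax: the number of 1-bits is the objective to maximise
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # One-point crossover combining two parents
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # Evaluate and keep the fitter half, then refill with offspring
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]
    offspring = [mutate(crossover(random.choice(parents),
                                  random.choice(parents)))
                 for _ in range(POP_SIZE - len(parents))]
    population = parents + offspring

best = max(population, key=fitness)
```

Real applications replace the fitness function with a domain-specific one (e.g. test coverage for test data generation, or makespan for task scheduling).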
Abstract:
This master’s thesis creates a framework for the pre-design of a product data management (PDM) system. The framework has three dimensions: value creation, functionality and software. It helps identify the value-creation components that can be influenced by the PDM functionalities offered by certain software categories. The system-design perspective of the framework is applied in the studied company cases, based on the relationships between the dimensions modelled in the form of a calculation matrix. The matrix is fed with the importance ratings that the value-creation and functionality components received in an interview study carried out in the target company. The output of the matrix is the suitability of a given software product for that company's case. The suitability is a set of indicators that are analysed in the results-processing phase. The suitability results assist the target company in choosing its approach to product data management, and they describe the pre-designed PDM system. Building the framework requires a thorough approach to defining the relevant value-creation and functionality components as well as the software categories. This definition work is based on the methods and component groupings drawn up in detail in the thesis. The analysis of each area enables the framework and the calculation matrix to be built on consistent definitions. A characteristic of the framework is its adaptability. In its current form it is suited to electronics and high-tech companies. The framework can also be exploited in other industries by modifying the value-creation components according to the interests of each industry. Correspondingly, the software to be analysed can be selected case by case. The calculation matrix must, however, first be updated with the capabilities of the selected software, after which the framework can produce suitability results for the company case in question.
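The calculation-matrix idea described above can be sketched as an importance-weighted capability score; all component names and numbers below are hypothetical illustrations, not taken from the thesis:

```python
import numpy as np

# Importance ratings for four functionality components, as they
# might come out of the interview study (hypothetical values)
importance = np.array([0.8, 0.5, 0.9, 0.3])

# Capability scores of two software classes for the same
# components (hypothetical values)
capability = np.array([
    [1.0, 0.6, 0.9, 0.2],  # software class A
    [0.4, 0.9, 0.5, 0.8],  # software class B
])

# Suitability indicator per software class: importance-weighted
# capability, normalised by the total importance
suitability = capability @ importance / importance.sum()
```

Updating the matrix for a newly selected software product then amounts to replacing its capability row and recomputing the indicators.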
Abstract:
This study presents an innovative pedagogical approach in which teachers become game designers and engage in creative teaching practices. Within co-design training workshops, 21 Spanish primary and secondary school teachers developed their own Game-Based Learning (GBL) scenarios, especially tailored to their teaching contexts and students' profiles. In total, the teachers developed 13 GBL scenarios and put them into practice in real teaching contexts. The present paper analyses the impact of this learner-centred game design approach on teachers' creativity from three different points of view: the GBL design process, the GBL scenario, and the teaching processes at stake.
Abstract:
Industrial applications today increasingly require real-time data processing. Reliability is one of the most important properties of a system capable of real-time data processing. To achieve it, both the hardware and the software must be tested. The main focus of this work is hardware testing and hardware testability, because a reliable hardware platform is the foundation of future real-time systems. The thesis presents the design of a processor board suited to digital signal processing. The processor board is intended for predictive condition monitoring of electrical machines. The latest DFT (Design for Testability) methods are introduced and applied to the design of the processor board together with older methods. Experiences and observations on the applicability of the methods are reported at the end of the work. The aim of the work is to develop a component for a web-based monitoring system under development at the Department of Electrical Engineering of Lappeenranta University of Technology.
Abstract:
To involve citizens in developing the processes of city making is an objective that occupies part of the agenda of political parties in the context of the necessary renewal of representative democracy. This paper aims to provide some answers to the following questions: Is it possible to move beyond participatory processes based exclusively on consultation? Is it possible to "train" residents to take an active role in decision-making? How can we manage, proactively, the relationship between public actors, technicians and politicians in a participatory process? We analyse the development of the process for creating the Wall of Remembrance in the Barcelona neighbourhood of Baró de Viver, a work of public art created and produced by its neighbours in the context of a long participatory process focused on changing the image of the neighbourhood and improving its public space. This result and this process were possible in a given context of cooperation among neighbours, local government and the research team (CR-Polis, Art, City, Society at the University of Barcelona). The development of a creative process of citizen participation between 2004 and 2011 made possible the direct management of decision-making by the residents in the design of public space in the neighbourhood. However, the material results of the process do not overshadow the great achievement of the project: the inclusion of a neighbourhood in making informed decisions through its empowerment in public space design and in the management of its remembrances.
Abstract:
Many software companies have begun to pay ever more attention to the quality of their software products. This has led most of them to choose software testing as the means by which this quality can be improved. Testing should not be limited to the software product itself but should cover the whole software development process. Validation testing focuses on ensuring that the final product meets the requirements set for it, whereas verification testing is used as preventive testing that aims to remove defects before they ever reach the source code. The work on which this thesis is based was carried out in the early spring and summer of 2003, commissioned by Necsom Oy. Necsom is a small Finnish software company whose research and development unit is located in Lappeenranta. This thesis first introduces software testing and different ways of organizing it. In addition, general guidelines are given for writing the test plans and test cases that successful and efficient testing requires. After this theory, an example is presented of how internal software testing was implemented at Necsom. Finally, the conclusions drawn from following the testing process in practice are presented, together with suggestions for further action.
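The kind of test case a test plan specifies can be illustrated with a minimal automated example; the function under test and its requirement are purely illustrative, not taken from the thesis:

```python
import unittest

def parse_version(text):
    """Parse a 'major.minor' version string into a tuple of ints."""
    major, minor = text.split(".")
    return int(major), int(minor)

class ParseVersionTest(unittest.TestCase):
    def test_meets_requirement(self):
        # Validation-style check: output matches the stated requirement
        self.assertEqual(parse_version("2.1"), (2, 1))

    def test_rejects_invalid_input(self):
        # Defect-catching check: bad input must fail loudly, not silently
        with self.assertRaises(ValueError):
            parse_version("not-a-version")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ParseVersionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A test plan would collect many such cases, each traceable to a requirement, so that both validation and early defect removal are systematic rather than ad hoc.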
Abstract:
BACKGROUND: Diabetes represents an increasing health burden worldwide. In 2010, the Public Health Department of the canton of Vaud (Switzerland) launched a regional diabetes programme entitled "Programme cantonal Diabète" (PcD), with the objectives of both decreasing the incidence of diabetes and improving care for patients with diabetes. The cohort entitled CoDiab-VD emerged from that programme. It specifically aimed at following the quality of diabetes care over time, at evaluating the coverage of the PcD within the canton and at assessing the impact of the PcD on the care of patients with diabetes. METHODS/DESIGN: The CoDiab-VD cohort is a prospective population-based cohort study. Patients with diabetes were recruited in two waves (autumn 2011 - summer 2012) through community pharmacies. Eligible participants were non-institutionalised adult patients (≥ 18 years) with diabetes diagnosed for at least one year, residing in the canton of Vaud and coming to a participating pharmacy with a diabetes-related prescription. Women with gestational diabetes and people with obvious cognitive impairment or an insufficient command of French were not eligible. The self-reported data collected included the following primary outcomes: processes-of-care indicators (annual checks) and outcomes of care such as HbA1c, (health-related) quality of life measures (Short Form-12 Health Survey - SF-12, Audit of Diabetes-Dependent Quality of Life 19 - ADDQoL) and the Patient Assessment of Chronic Illness Care (PACIC). Data on diabetes, health status, healthcare utilisation, health behaviour, self-management activities and support, knowledge of, or participation in, campaigns/activities proposed by the PcD, and socio-demographics were also obtained. For consenting participants, physicians provided a few additional pieces of information about processes and laboratory results. Participants will be followed once a year via a mailed self-report questionnaire.
The core of the follow-up questionnaires will be similar to the baseline one, with the addition of thematic modules adapted to the development of the PcD. Physicians will be contacted every 2 years. DISCUSSION: CoDiab-VD will provide a broad picture of the care of patients with diabetes, as well as of their needs regarding their chronic condition. The data will be used to evaluate the PcD and to help prioritise targeted actions. TRIAL REGISTRATION: This study is registered with ClinicalTrials.gov, identifier NCT01902043, July 9, 2013.
Abstract:
This thesis presents the design and implementation of a GPS signal source suitable for receiver measurements. The developed signal source is based on direct digital synthesis, which generates the intermediate frequency. The intermediate frequency is transferred to the final frequency with the aid of an in-phase/quadrature (I/Q) modulator. The modulating GPS data was generated with MATLAB. The signal source was duplicated to form a multi-channel source. It was shown that GPS signals meant for civil navigation are easy to generate in the laboratory. The hardware does not need to be technically advanced if navigation with a high level of accuracy is not needed. It was also shown that the I/Q modulator can function as a single-sideband upconverter even with a high intermediate frequency. This concept reduces the demands placed on output filtering.
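The direct-digital-synthesis stage mentioned above can be sketched as a phase accumulator indexing a sine lookup table; the clock rate, accumulator width and intermediate frequency below are illustrative assumptions, not the thesis design values:

```python
import numpy as np

ACC_BITS = 32          # phase accumulator width
LUT_BITS = 10          # sine lookup table address width
f_clock = 50e6         # DDS clock in Hz (illustrative)
f_out = 4.092e6        # desired intermediate frequency (illustrative)

# The tuning word is the phase increment added on every clock tick
tuning_word = round(f_out * 2**ACC_BITS / f_clock)

# One full sine period stored in the lookup table
lut = np.sin(2 * np.pi * np.arange(2**LUT_BITS) / 2**LUT_BITS)

n = np.arange(4096)
phase = (n * tuning_word) % 2**ACC_BITS        # accumulator wraps around
samples = lut[phase >> (ACC_BITS - LUT_BITS)]  # top bits address the LUT

# Achieved frequency; it deviates from f_out by at most half the
# DDS frequency resolution, f_clock / 2**ACC_BITS
f_actual = tuning_word * f_clock / 2**ACC_BITS
```

In a hardware DDS the same accumulator-and-table structure runs in logic, and the resulting IF is what the I/Q modulator shifts to the final carrier frequency.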
Abstract:
Several possible methods of increasing the efficiency and power of hydro power plants by improving the flow passages are investigated in this study. The theoretical background of diffuser design and its application to the optimisation of hydraulic turbine draft tubes is presented in the first part of this study. Several draft tube modernisation projects that have been carried out recently are discussed. In addition, a method of increasing the efficiency of the draft tube by injecting a high-velocity jet into the boundary layer is presented. Methods of increasing the head of a hydro power plant by using an ejector or a jet pump are discussed in the second part of this work. The theoretical principles of various ejector and jet pump types are presented, and four different methods of calculating them are examined in more detail. A computer code written by the author is used to calculate the gain in head for two example power plants. Suitable ejector installations for the example plants are also discussed. The efficiency of the ejector power was found to be in the range of 6-15% for conventional head increasers, and 30% for the jet pump at its optimum operating point. In practice, it is impossible to install an optimised jet pump with a 30% efficiency into the draft tube, as this would considerably reduce the efficiency of the draft tube under normal operating conditions. It demonstrates, however, the potential for improvement which lies in conventional head increaser technology. This study is based on previous publications and on published test results. No laboratory measurements were made for this study. Certain aspects of modelling the flow in the draft tube using computational fluid dynamics are discussed in the final part of this work. The draft tube inlet velocity field is a vital boundary condition for such a calculation. Several previously measured velocity fields that have been successfully utilised in such flow calculations are presented herein.
Abstract:
During the last decade, high-speed motor technology has been applied increasingly often in the medium and large power range. In particular, applications involving gas movement and compression seem to be the most important area in which high-speed machines are used. By manufacturing the induction motor rotor core from a single piece of steel, it is possible to achieve an extremely rigid rotor construction for the high-speed motor. In a mechanical sense, the solid rotor may be the best possible rotor construction. Unfortunately, the electromagnetic properties of a solid rotor are poorer than those of the traditional laminated rotor of an induction motor. This thesis analyses methods for improving the electromagnetic properties of a solid-rotor induction machine. The slip of the solid rotor is reduced notably if the rotor is axially slitted. The slitting patterns of the solid rotor are examined, and it is shown how the slitting parameters affect the produced torque. Methods for decreasing the harmonic eddy currents on the surface of the rotor are also examined. The motivation for this is to improve the efficiency of the motor so that it reaches the efficiency standard of a laminated-rotor induction motor. To carry out these research tasks, finite element analysis is used. An analytical calculation of solid rotors based on the multi-layer transfer-matrix method is developed, especially for the calculation of axially slitted solid rotors equipped with well-conducting end rings. The calculation results are verified by using finite element analysis and laboratory measurements. Prototype motors of 250 - 300 kW and 140 Hz were tested to verify the results. Utilization factor data are given for several other prototypes, the largest of which delivers 1000 kW at 12000 min^-1.
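The slip-torque relationship discussed above can be illustrated with the standard textbook Kloss approximation; the breakdown torque and breakdown slip values below are illustrative, not taken from the prototype machines:

```python
import numpy as np

def torque(slip, t_breakdown=2000.0, s_breakdown=0.15):
    """Torque (N*m) at a given slip via the Kloss approximation:
    T = 2*T_b / (s/s_b + s_b/s)."""
    return 2 * t_breakdown / (slip / s_breakdown + s_breakdown / slip)

slips = np.linspace(0.01, 1.0, 100)   # avoid slip = 0 (division by zero)
torques = torque(slips)

# Torque peaks at the breakdown slip; reducing the operating slip
# (e.g. by axially slitting the solid rotor) moves the machine
# toward the low-slip, higher-efficiency side of this curve
peak_slip = float(slips[np.argmax(torques)])
```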
Abstract:
In this work we present and analyse the application of an experience of Project Based Learning (PBL) in the subject Physics II of the Industrial Design university degree (Girona University) during the 2005-2006 academic year. This methodology was applied to the electrostatics and direct current topics. Furthermore, evaluation and self-evaluation results are shown, and the academic results are compared with those obtained in the same subjects using conventional teaching methods.
Abstract:
This thesis evaluates methods for obtaining high performance in applications running on the mobile Java platform. Based on the evaluated methods, an optimization was made to a Java extension API running on top of the Symbian operating system. The API provides location-based services for mobile Java applications. As part of this thesis, the JNI implementation in Symbian OS was also benchmarked. A benchmarking tool was implemented in the analysis phase in order to build an extensive set of performance tests. Based on the benchmark results, it was noted that the landmarks implementation of the API performed very slowly with large amounts of data. The existing implementation proved very inconvenient to optimize because the early implementers had not taken performance and design issues into consideration. A completely new architecture was implemented for the API in order to provide scalable landmark initialization and data extraction using lazy initialization methods. Additionally, runtime memory consumption was an important part of the optimization. Measurements taken after the optimization showed the improvement to be very effective: most of the common API use cases performed extremely well compared to the old implementation. Performance is an important quality attribute of any piece of software, especially on embedded mobile devices. Typically, projects get into trouble with performance because there are no clear performance targets and no knowledge of how to achieve them. Well-known guidelines and performance models help to achieve good overall performance in Java applications and programming interfaces.
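The lazy-initialization idea behind the new architecture can be sketched as follows; the class and method names are hypothetical, not the actual API's, and the sketch is in Python rather than Java for brevity:

```python
class Landmark:
    """Heavy record data is parsed only on first access, not when
    the landmark store is opened (hypothetical names throughout)."""

    def __init__(self, record_id, storage):
        self._record_id = record_id
        self._storage = storage   # raw backing store
        self._details = None      # parsed lazily

    @property
    def details(self):
        if self._details is None:          # first access triggers the
            self._details = self._storage.load(self._record_id)  # load
        return self._details


class CountingStore:
    """Fake backing store that counts expensive load operations."""

    def __init__(self):
        self.loads = 0

    def load(self, record_id):
        self.loads += 1
        return {"id": record_id, "name": f"landmark-{record_id}"}


store = CountingStore()
landmarks = [Landmark(i, store) for i in range(10_000)]
print(store.loads)            # 0: creating the list parsed nothing
_ = landmarks[42].details
print(store.loads)            # 1: only the accessed record was loaded
```

This is why lazy initialization scales with the number of records actually used rather than with the total amount of landmark data, which also keeps runtime memory consumption proportional to actual use.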