943 results for Software analysis
Abstract:
BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results of the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF, the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all showing LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values using both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end-of-test was incorrectly recognized too early during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977. © 2015 Wiley Periodicals, Inc.
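As context for the end-of-test problem described above, the sketch below illustrates how LCI is conventionally computed from a multiple breath washout: cumulative expired volume divided by FRC at the first breath whose end-tidal tracer concentration falls below 1/40 of the starting value. This is the generic textbook definition, not the commercial software discussed in the abstract; the variable names are assumptions for illustration.

```python
# Minimal sketch of LCI calculation from a multiple-breath washout (MBW),
# assuming per-breath end-tidal tracer concentrations and expired volumes
# are already available. The 1/40 end-of-test criterion is the conventional
# definition, not the specific algorithm of the software under study.

def lung_clearance_index(end_tidal_conc, expired_volumes, frc, c_start):
    """Return LCI = cumulative expired volume / FRC at the first breath whose
    end-tidal tracer concentration falls to 1/40 of the starting value."""
    cumulative_expired = 0.0
    for conc, vol in zip(end_tidal_conc, expired_volumes):
        cumulative_expired += vol
        if conc <= c_start / 40.0:      # conventional end-of-test criterion
            return cumulative_expired / frc
    raise ValueError("washout never reached 1/40 of the starting concentration")

# Recognizing the end-of-test too early (truncating the breath list) yields a
# spuriously low LCI, which is the failure mode described in the abstract.
```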
Abstract:
OBJECTIVES: The aims of the study were to use cone beam computed tomography (CBCT) images of nasopalatine duct cysts (NPDC) and to calculate the diameter, surface area, and 3D volume using a custom-made software program. Furthermore, any associations of the dimensions of NPDC with age, gender, presence/absence of maxillary incisors/canines (MI/MC), endodontic treatment of MI/MC, presenting symptoms, and postoperative complications were evaluated. MATERIAL AND METHODS: The study comprised 40 patients with a histopathologically confirmed NPDC. On preoperative CBCT scans, curves delineating the cystic borders were drawn in all planes, and the widest diameter (in millimeters), surface area (in square millimeters), and volume (in cubic millimeters) were calculated. RESULTS: The overall mean cyst diameter was 15 mm (range 7-47 mm), the mean cyst surface area 566 mm² (84-4,516 mm²), and the mean cyst volume 1,735 mm³ (65-25,350 mm³). For 22 randomly allocated cases, a second measurement resulted in a mean absolute aberration of ±4.2% for the volume, ±2.8% for the surface area, and ±4.9% for the diameter. A statistically significant association was found between the CBCT-determined cyst measurements and both the need for preoperative endodontic treatment of MI/MC and postoperative complications. CONCLUSION: In the hands of a single experienced operator, the novel software exhibited high repeatability for measurements of cyst dimensions. Further studies are needed to assess the application of this tool for dimensional analysis of different jaw cysts and lesions, including treatment planning. CLINICAL RELEVANCE: Accurate radiographic information on the bone volume lost (osteolysis) due to expansion of a cystic lesion in three dimensions could help in personalized treatment planning.
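The repeatability figures quoted above are mean absolute aberrations between a first and a second measurement of the same cases. The exact formula the authors used is not stated, so the sketch below is only an assumed, illustrative reading of that metric; the numbers are hypothetical.

```python
# Minimal sketch of a repeatability metric consistent with the abstract: the
# mean absolute percentage difference between repeated measurements of the
# same cysts. Illustrative assumption only, not the authors' implementation.

def mean_absolute_aberration_percent(first, second):
    """Mean absolute difference between paired measurements, expressed as a
    percentage of the first measurement."""
    diffs = [abs(a - b) / a * 100.0 for a, b in zip(first, second)]
    return sum(diffs) / len(diffs)

# e.g. hypothetical cyst volumes (mm^3) measured twice for three cases
v1 = [1735.0, 650.0, 2200.0]
v2 = [1790.0, 628.0, 2140.0]
print(mean_absolute_aberration_percent(v1, v2))
```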
Abstract:
Background: We use an approach based on Factor Analysis to analyze datasets generated for transcriptional profiling. The method groups samples into biologically relevant categories and enables the identification of the genes and pathways most significantly associated with each phenotypic group, while allowing a given gene to participate in more than one cluster. Genes assigned to each cluster are used for the detection of pathways predominantly activated in that cluster by finding significantly associated GO terms. We tested the approach with a published dataset of microarray experiments in yeast. Upon validation with the yeast dataset, we applied the technique to a prostate cancer dataset. Results: Two major pathways are shown to be activated in organ-confined, non-metastatic prostate cancer: those regulated by the androgen receptor and by receptor tyrosine kinases. A number of gene markers (HER3, IQGAP2 and POR1) highlighted by the software and related to the latter pathway were validated experimentally a posteriori on independent samples. Conclusion: Using a new microarray analysis tool followed by a posteriori experimental validation of the results, we have confirmed several putative markers of malignancy associated with peptide growth factor signalling in prostate cancer and revealed others, most notably ERBB3 (HER3). Our study suggests that, in primary prostate cancer, HER3, with or without HER4, rather than receptor complexes involving HER2, could play an important role in the biology of these tumors. These results provide new evidence for the role of receptor tyrosine kinases in the establishment and progression of prostate cancer.
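A common way to find GO terms significantly associated with a gene cluster, as described above, is an over-representation (hypergeometric) test. The abstract does not name the exact test, so the sketch below is an illustrative assumption rather than the authors' implementation.

```python
# Minimal sketch of GO term over-representation testing for one cluster,
# assuming gene identifiers are held in Python sets. Requires scipy.
from scipy.stats import hypergeom

def go_term_enrichment_pvalue(cluster_genes, go_term_genes, all_genes):
    """Probability of seeing at least the observed number of GO-annotated
    genes in the cluster if cluster membership were random."""
    M = len(all_genes)                        # genes measured on the array
    K = len(go_term_genes & all_genes)        # genes annotated with the GO term
    n = len(cluster_genes)                    # genes assigned to the cluster
    k = len(cluster_genes & go_term_genes)    # annotated genes in the cluster
    return hypergeom.sf(k - 1, M, K, n)       # upper-tail probability P(X >= k)

# In practice, p-values computed over many GO terms would then be corrected
# for multiple testing before declaring a pathway "predominantly activated".
```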
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular floods and landslides, based on open-source geospatial software and technologies. The aim of the presented tool is to assist experts (risk managers) in analysing the impacts and consequences of a given hazard event in a considered region, providing essential input to the decision-making process in the selection of risk management strategies by responsible authorities and decision makers. The tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform for risk management. Within this platform, users can import the maps and information needed to analyse areas at risk. Based on the provided information and parameters, loss scenarios (amount of damage and number of fatalities) of a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated by combining the resulting loss scenarios with the different return periods of the hazard event. The application of this prototype is demonstrated using a regional data set from the Fella River in northeastern Italy, one of the case study sites of the Marie Curie ITN CHANGES project.
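One common way to annualize risk from loss scenarios with different return periods, as mentioned above, is to treat each scenario's annual exceedance probability as 1/T and integrate loss over that probability. The platform's exact aggregation formula is not given in the abstract, so the trapezoidal approximation and the example numbers below are assumptions for illustration.

```python
# Minimal sketch: expected annual loss from (return period, loss) scenarios,
# integrating loss over annual exceedance probability p = 1/T.

def annualized_risk(scenarios):
    """scenarios: iterable of (return_period_years, loss) tuples."""
    pts = sorted((1.0 / t, loss) for t, loss in scenarios)   # sort by p ascending
    risk = 0.0
    for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
        risk += 0.5 * (l0 + l1) * (p1 - p0)                  # trapezoidal rule over p
    return risk

# Hypothetical losses for 30-, 100- and 300-year flood scenarios (EUR)
print(annualized_risk([(30, 2.0e6), (100, 5.0e6), (300, 9.0e6)]))
```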
Abstract:
OBJECTIVE: To evaluate lung fissure completeness, post-treatment radiological response, and quantitative CT analysis (QCTA) in a population of patients with severe emphysema submitted to endobronchial valve (EBV) implantation. MATERIALS AND METHODS: Multi-detector CT exams of 29 patients were studied, using a thin-section, low-dose protocol without contrast. Two radiologists retrospectively reviewed all images in consensus; fissure completeness was estimated in 5% increments, and the post-EBV radiological response (target lobe atelectasis/volume loss) was evaluated. QCTA was performed on pre- and post-treatment scans using fully automated software. RESULTS: A CT response was present in 16/29 patients. In the negative CT response group, all 13 patients presented incomplete fissures, and the mean oblique fissure completeness was 72.8%, against 88.3% in the other group. The most significant QCTA results showed a reduced post-treatment total lung volume (LV) (mean 542 ml), reduced LV of the lung submitted to EBV (700 ml), and reduced emphysema volume (331.4 ml) in the positive response group, which also showed improved functional tests. CONCLUSION: EBV benefit is most likely in patients who have complete interlobar fissures and develop lobar atelectasis. In patients with no radiological response, we observed a higher prevalence of incomplete fissures and a greater degree of incompleteness. The fully automated QCTA detected the post-treatment alterations, especially in the analysis of the treated lung.
Abstract:
Objective: To define the distal femur rotation pattern in a Brazilian population, correlating this pattern with the one suggested by the arthroplasty instruments, and to analyze the variability of each anatomic parameter. Materials and Methods: A series of 101 magnetic resonance imaging studies were evaluated between April and June 2012. The epidemiological data were collected with the aid of the institution's computed imaging system, and the sample included 52 male and 49 female patients. The measurements were made in the axial plane, with subsequent correlation and triangulation with the other planes. The posterior condylar line was used as the reference for angle measurements. Subsequently, the anatomical and surgical transepicondylar axes and the anteroposterior trochlear line were determined. The angles between the reference line and the studied lines were calculated with the aid of the institution's software. Results: The mean angle between the anatomical transepicondylar axis and the posterior condylar line was 6.89°, ranging from 0.25° to 12°. For the surgical transepicondylar axis, the mean value was 2.89°, ranging from –2.23° (internal rotation) to 7.86°, and for the axis perpendicular to the anteroposterior trochlear line, the mean value was 4.77°, ranging from –2.09° to 12.2°. Conclusion: The anatomical transepicondylar angle showed mean values corresponding to the measurement observed in the Caucasian population. The instruments used are appropriate, but no anatomical parameter proved to be steady enough to be used in isolation.
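The angles reported above are measured between the posterior condylar line (the reference) and another axis, each defined by two landmark points in the axial plane. The institution's software is not reproduced here; the sketch below is only a generic illustration of such an angle measurement, with hypothetical landmark coordinates.

```python
# Minimal sketch: signed angle (degrees) between a reference line and an axis,
# each given by two landmark points (x, y) in the axial plane.
import math

def axis_angle_deg(ref_p1, ref_p2, axis_p1, axis_p2):
    """Signed angle from the reference line to the studied axis."""
    rx, ry = ref_p2[0] - ref_p1[0], ref_p2[1] - ref_p1[1]
    ax, ay = axis_p2[0] - axis_p1[0], axis_p2[1] - axis_p1[1]
    angle = math.degrees(math.atan2(ay, ax) - math.atan2(ry, rx))
    while angle > 90:          # lines have no direction, so fold into (-90, 90]
        angle -= 180
    while angle <= -90:
        angle += 180
    return angle

# Hypothetical posterior condyles at (0, 0) and (60, 0); epicondyles at (2, 5), (58, 2)
print(axis_angle_deg((0, 0), (60, 0), (2, 5), (58, 2)))   # negative = internal rotation
```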
Abstract:
Over the last decades, calibration techniques have been widely used to improve the accuracy of robots and machine tools, since they only involve software modification instead of changes to the design and manufacture of the hardware. Traditionally, four steps are required for a calibration: error modeling, measurement, parameter identification, and compensation. The objective of this thesis is to propose a method for the kinematics analysis and error modeling of a newly developed hybrid redundant robot, the IWR (Intersector Welding Robot), which possesses ten degrees of freedom (DOF): 6 DOF in parallel and an additional 4 DOF in series. The problems of kinematics modeling and error modeling of the proposed IWR robot are discussed. Based on the vector arithmetic method, the kinematics model and the sensitivity model of the end-effector with respect to the structure parameters are derived and analyzed. The relations between the pose (position and orientation) accuracy and the manufacturing tolerances, actuation errors, and connection errors are formulated. Computer simulation is performed to examine the validity and effectiveness of the proposed method.
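The sensitivity idea described above can be stated generically as a first-order relation between pose error and parameter error, delta_x ≈ J_s · delta_p, where J_s is the Jacobian of the forward kinematics with respect to the structure parameters. The IWR robot's actual kinematics and the vector arithmetic derivation are not given here, so the sketch below uses a placeholder `forward_kinematics` function and numerical differentiation purely for illustration.

```python
# Minimal sketch: numerical estimate of the sensitivity Jacobian d(pose)/d(params).
import numpy as np

def sensitivity_jacobian(forward_kinematics, params, eps=1e-6):
    """Central-difference estimate of the Jacobian of the pose with respect to
    the structure parameters (tolerances, actuation, connection errors)."""
    params = np.asarray(params, dtype=float)
    x0 = np.asarray(forward_kinematics(params))
    J = np.zeros((x0.size, params.size))
    for i in range(params.size):
        dp = np.zeros_like(params)
        dp[i] = eps
        J[:, i] = (np.asarray(forward_kinematics(params + dp)) -
                   np.asarray(forward_kinematics(params - dp))) / (2 * eps)
    return J

# Predicted pose error from small parameter errors: delta_x = J @ delta_p
```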
Abstract:
A flow system coupled to a tungsten coil atomizer in an atomic absorption spectrometer (TCA-AAS) was developed for As(III) determination in waters, by extraction with sodium diethyldithiocarbamate (NaDDTC) as the complexing agent, sorption of the As(III)-DDTC complex in a micro-column filled with 5 mg of C18 reversed phase (10 µL dry sorbent), and elution with ethanol. A complete pre-concentration/elution cycle took 208 s, with a 30 s sample load time (1.7 mL) and a 4 s elution time (71 µL). The interface and software for the synchronous control of two peristaltic pumps (RUN/STOP), an autosampler arm, seven solenoid valves, one injection valve, the electrothermal atomizer, and the spectrometer Read function were constructed. The system was characterized and validated by analytical recovery studies performed both in synthetic solutions and in natural waters. Using a 30 s pre-concentration period, the working curve was linear between 0.25 and 6.0 µg L⁻¹ (r = 0.9976), the retention efficiency was 94±1% (at 6.0 µg L⁻¹), and the pre-concentration coefficient was 28.9. The characteristic mass was 58 pg, the mean repeatability (expressed as the coefficient of variation) was 3.4% (n = 5), the detection limit was 0.058 µg L⁻¹ (4.1 pg in 71 µL of eluate injected into the coil), and the mean analytical recovery in natural waters was 92.6 ± 9.5% (n = 15). The procedure is simple, economical, and less prone to sample loss and contamination, and the useful lifetime of the micro-column was 200-300 pre-concentration cycles.
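For context on the figures of merit quoted above, the sketch below shows a generic linear calibration fit and a detection limit estimated with the common 3·s/slope convention. The abstract does not state which standard deviation the authors used, so that convention, and all the data values, are assumptions for illustration only.

```python
# Minimal sketch: linear working curve and detection limit (LOD = 3*s/slope).
import numpy as np

conc = np.array([0.25, 1.0, 2.0, 4.0, 6.0])          # standards, ug/L (assumed)
absorbance = np.array([0.010, 0.041, 0.080, 0.158, 0.240])  # assumed signals

slope, intercept = np.polyfit(conc, absorbance, 1)   # least-squares line
r = np.corrcoef(conc, absorbance)[0, 1]              # correlation coefficient

s_blank = 0.0008                                      # std. dev. of blank signal (assumed)
lod = 3 * s_blank / slope                             # detection limit, ug/L

print(f"slope={slope:.4f}, r={r:.4f}, LOD={lod:.3f} ug/L")
```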
Abstract:
Software faults are expensive and cause serious damage, particularly if discovered late or not at all. Some software faults tend to stay hidden. One goal of the thesis is to establish the status quo in the field of software fault elimination, since there are no recent surveys of the whole area. A basis for a structural framework is proposed for this unstructured field, paying attention to compatibility and to how studies can be found. Bug elimination means are surveyed, including bug know-how, defect prevention and prediction, analysis, testing, and fault tolerance. The most common research issues for each area are identified and discussed, along with issues that do not get enough attention. Recommendations are presented for software developers, researchers, and teachers. Only the main lines of research are covered, and the main emphasis is on technical aspects. The survey was done by performing searches in the IEEE, ACM, Elsevier, and Inspec databases. In addition, a systematic search was done in a few well-known related journals over recent time intervals. Some other journals, some conference proceedings, and a few books, reports, and Internet articles were investigated as well. The following problems were found and solutions for them discussed. That quality assurance is testing only is a common misunderstanding, and many checks are done and some methods applied only in the late testing phase. Many types of static review are almost forgotten, even though they reveal faults that are hard to detect by other means. Other forgotten areas are knowledge of bugs, awareness of continuously repeated bugs, and lightweight means to increase reliability. Compatibility between studies is not always good, which also makes documents harder to understand. Some means, methods, and problems are considered method- or domain-specific when they are not. The field lacks cross-field research.
Abstract:
Because chemometrics is used extensively yet is typically associated with applications that require purchased licenses or routines, the objective of this work was to develop a free-access exploratory data analysis software application for academic use that is easy to install and can be operated without user-level programming. The developed software, called Chemostat, employs Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and interval Principal Component Analysis (iPCA), as well as correction methods, data transformation, and outlier detection. Data can be imported from the clipboard, text files, ASCII files, or FT-IR Perkin-Elmer “.sp” files. The software generates a variety of charts and tables for the analysis of results, which can be exported in several formats. The main features of the software were tested using mid-infrared and near-infrared spectra of vegetable oils and digital images obtained from different types of commercial diesel. In order to validate the software, the same data sets were analyzed using Matlab©, and the results of the two applications matched in the various combinations tested. In addition to the desktop version, the reuse of the algorithms allowed an online version to be provided, offering a unique experience on the web. Both applications are available in English.
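As a pointer to the kind of exploratory analysis described above, the sketch below computes PCA scores for mean-centered spectra via the SVD. It is a generic illustration, not the Chemostat implementation; `spectra` stands in for an (n_samples × n_variables) matrix imported from text, ASCII, or “.sp” files.

```python
# Minimal sketch: PCA scores of mean-centered spectra.
import numpy as np

def pca_scores(spectra, n_components=2):
    """Return PCA scores and explained-variance ratios via SVD."""
    X = np.asarray(spectra, dtype=float)
    Xc = X - X.mean(axis=0)                         # mean-center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = (S ** 2) / np.sum(S ** 2)
    return scores, explained[:n_components]

# Usage: scores, var = pca_scores(spectra); a PC1 vs PC2 scores plot is the
# typical chart used to compare, e.g., vegetable oils or diesel samples.
```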
Abstract:
This diploma thesis was done for an international organization that handles the accounting of two major companies. The organization uses three different purchasing tools for entering new asset master data into the SAP R/3 system. The aim of this thesis is to find out how much changing the user interface of one of these three e-procurement programs will affect overall efficiency in asset accounting. In addition, a project framework is introduced that can be used in future projects and helps to avoid certain steps in the development process. At the moment, data must be entered manually with many unnecessary mouse clicks and searched from various resources, which slows down the process. The other organization currently has better tools than the myOrders system, which is under investigation here. The research was started by exploring the main areas for improvement. After this, possible defects were traced. Suggested improvements were devised by reviewing the literature on usability design and research. Meanwhile, indicative calculations of the benefits of the project were made, alongside an analysis of the possible risks and threats. After this, NSN IT approved the changes it considered acceptable. The next step was to program them into the tool and test them before release to the production environment. Calculations were also made for the implemented improvements and compared with the planned ones. From the whole project, a framework was created that can also be utilized in other similar projects. A complete calculation was not possible because of the project schedule. An important observation in the project was that efficiency is improved not only by changing the GUI but also by improving processes without any programming. Feedback from end users should also be listened to more in the development process; the end user is, after all, the one who knows best how the program should look.
Abstract:
A coupled system simulator, based on analytical circuit equations and a finite element method (FEM) model of the motor, has been developed and used to analyse a frequency-converter-fed industrial squirrel-cage induction motor. Two control systems that emulate the behaviour of commercial direct-torque-controlled (DTC) and vector-controlled industrial frequency converters have been studied, implemented in the simulation software, and verified by extensive laboratory tests. Numerous factors that affect the operation of a variable speed drive (VSD) and its energy efficiency have been investigated, and their significance for the simulation results of the VSD has been studied. The dependence of the frequency converter, induction motor, and system losses on the switching frequency is investigated by simulations and measurements at different speeds for both the vector control and the DTC. Intensive laboratory measurements have been carried out to verify the simulation results.
Abstract:
Cloud computing enables on-demand network access to shared resources (e.g., computation, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing refers to both the applications delivered as services over the Internet and the hardware and system software in the data centers. Software as a service (SaaS) is part of cloud computing; it is one of the cloud service models. SaaS is software deployed as a hosted service and accessed over the Internet. In SaaS, the consumer uses the provider's applications running in the cloud. SaaS separates the possession and ownership of software from its use. The applications can be accessed from any device through a thin client interface. A typical SaaS application is used with a web browser based on monthly pricing. In this thesis, the characteristics of cloud computing and SaaS are presented, and a few implementation platforms for SaaS are discussed. Then, four different SaaS implementation cases and one transformation case are examined. The pros and cons of SaaS are studied on the basis of literature references and an analysis of the SaaS implementations and the transformation case. The analysis is done both from the customer's and the service provider's point of view. In addition, the pros and cons of on-premises software are listed. The purpose of this thesis is to find out when SaaS should be utilized and when it is better to choose traditional on-premises software. The qualities of SaaS bring many benefits for both the customer and the provider. A customer should utilize SaaS when it provides cost savings, ease, and scalability over on-premises software. SaaS is reasonable when the customer does not need tailoring but only a simple, general-purpose service, and the application supports the customer's core business. A provider should utilize SaaS when it offers cost savings, scalability, faster development, and a wider customer base over on-premises software. It is wise to choose SaaS when the application is cheap, aimed at the mass market, needs frequent updating, needs high-performance computing, needs to store large amounts of data, or there is some other direct value from the cloud infrastructure.
Abstract:
The increase of computational power and the emergence of new computer technologies have led to the popularity of local communications between personal trusted devices. In turn, this has led to the emergence of security problems related to the user data utilized in such communications. One of the main aspects of data security assurance is the security of the software operating on mobile devices. The aim of this work was to analyze security threats to PeerHood, software intended for personal communications between mobile devices regardless of the underlying network technologies. To reach this goal, risk-based software security testing was performed. The results of the testing showed that the project has several security vulnerabilities, so PeerHood cannot be considered secure software. The analysis made in this work is the first step towards the further implementation of PeerHood security mechanisms, as well as towards taking security into account in the development process of this project.
Abstract:
This paper presents software developed in the Delphi programming language to compute a reservoir's annual regulated active storage, based on the sequent-peak algorithm. Mathematical models used for that purpose generally require extended hydrological series, and the analysis of those series is usually performed with spreadsheets or graphical representations. For this reason, software was developed for the calculation of reservoir active capacity. An example calculation is shown using 30 years (from 1977 to 2009) of historical monthly mean flow data from the Corrente River, located in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to support water resources management, helping to manipulate data and to highlight information of interest to the user. Moreover, with that interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water demand situations. A practical application shows that the program provides the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to simulate seasonal water demands throughout the year, contributing to studies concerning reservoir projects. This program, with its functionality, is an important tool for decision making in water resources management.
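The sequent-peak algorithm named above computes the required active storage as the largest cumulative deficit of inflow relative to the regulated demand. The Delphi program itself is not reproduced here; the sketch below is a minimal, generic version of the algorithm with hypothetical monthly flows and a constant demand.

```python
# Minimal sketch of the sequent-peak algorithm: K_t = max(0, K_{t-1} + D_t - Q_t),
# required storage = max over t of K_t.

def sequent_peak_storage(inflows, demands):
    """Required active storage for monthly inflow and demand series (same units)."""
    deficit = 0.0        # cumulative deficit K_t
    required = 0.0
    for q, d in zip(inflows, demands):
        deficit = max(0.0, deficit + d - q)   # deficit grows when demand exceeds inflow
        required = max(required, deficit)     # track the largest sequential deficit
    return required

# Hypothetical 12 monthly mean flows and a constant regulated demand (hm^3/month)
inflows = [8, 6, 5, 3, 2, 2, 1, 1, 2, 4, 6, 7]
demand = [4] * 12
print(sequent_peak_storage(inflows, demand))   # largest cumulative deficit
```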