843 results for Residual-Based Cointegration Test
Abstract:
Modern, sophisticated telecommunication devices require ever more comprehensive testing to ensure quality. The number of test cases needed to provide sufficient coverage has grown rapidly, and this demand can no longer be met by manual testing alone. New agile development models also require all test cases to be executed in every iteration. This has led manufacturers to use test automation more than ever to achieve adequate testing coverage and quality. This thesis is divided into three parts. The first part opens with the evolution of cellular networks and then examines software testing, test automation and the influence of the development model on testing. The second part describes the process used to implement a test automation scheme for functional testing of the LTE core network MME element; agile development models and the Robot Framework test automation tool were used in the implementation. The third part presents two alternative models for integrating this test automation scheme into a continuous integration process. As a result, the test automation scheme for functional testing was implemented, and almost all new functional-level test cases can now be automated with it. In addition, two models for integrating the scheme into a wider continuous integration pipeline were introduced. The shift in testing from a traditional waterfall model to a new agile-development-based model was also found to be successful.
Abstract:
Two simple, sensitive and cost-effective spectrophotometric methods are described for the determination of lansoprazole (LPZ) in bulk drug and in capsules using ceric ammonium sulphate (CAS), iron(II), orthophenanthroline and thiocyanate as reagents. In both methods, an acidic solution of lansoprazole is treated with a measured excess of CAS, followed by the determination of the unreacted oxidant by two procedures involving different reaction schemes. The first method involves the reduction of the residual oxidant by a known amount of iron(II); the unreacted iron(II) is complexed with orthophenanthroline at a raised pH, and the absorbance of the resulting complex is measured at 510 nm (method A). In the second method, the unreacted CAS is reduced by an excess of iron(II), the resulting iron(III) is complexed with thiocyanate in acid medium, and the absorbance of the complex is measured at 470 nm (method B). In both methods, the amount of CAS reacted corresponds to the amount of LPZ. In method A the absorbance increases linearly with the concentration of LPZ, whereas in method B a linear decrease in absorbance occurs. The systems obey Beer's law over 2.5-30 and 2.5-25 µg mL⁻¹ for method A and method B, respectively, and the corresponding molar absorptivity values are 8.1×10³ and 1.5×10⁴ L mol⁻¹ cm⁻¹. The methods were successfully applied to the determination of LPZ in capsules, and the results tallied well with the label claim. No interference was observed from the concomitant substances normally added to capsules.
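As a rough illustration of the calibration behind these figures, the sketch below converts a mass concentration of LPZ to molarity and applies the Beer-Lambert relation A = εbc with the molar absorptivities reported above; the LPZ molar mass (~369.4 g/mol) and the 1 cm path length are assumptions, and the sign of method B's calibration slope (absorbance decreasing with concentration) is ignored here.

```python
# Minimal Beer-Lambert sketch: predicted absorbance for a given lansoprazole (LPZ)
# concentration, using the molar absorptivities quoted in the abstract.
# The LPZ molar mass and the 1 cm path length are assumptions for illustration.

MOLAR_MASS_LPZ = 369.4      # g/mol, assumed approximate value
PATH_LENGTH_CM = 1.0        # cm, assumed standard cuvette

EPSILON = {"method A": 8.1e3, "method B": 1.5e4}  # L mol^-1 cm^-1, from the abstract

def absorbance(conc_ug_per_ml: float, method: str) -> float:
    """Predicted absorbance A = epsilon * b * c for a dilute LPZ solution."""
    conc_mol_per_l = conc_ug_per_ml * 1e-3 / MOLAR_MASS_LPZ  # µg/mL -> g/L -> mol/L
    return EPSILON[method] * PATH_LENGTH_CM * conc_mol_per_l

for m in EPSILON:
    # 10 µg/mL lies within the Beer's law ranges quoted above
    print(m, round(absorbance(10.0, m), 3))
```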
Lanthanum-based high-surface-area perovskite-type oxide and its application in CO and propane combustion
Abstract:
Perovskite-type oxides containing transition metals show promising potential as catalysts for total oxidation reactions. The present work investigates the effect of synthesis by oxidant co-precipitation on the catalytic activity of the perovskite-type oxides LaBO3 (B = Co, Ni, Mn) in the total oxidation of propane and CO. The perovskite-type oxides were characterized by X-ray diffraction, nitrogen adsorption (BET method), thermogravimetric and differential thermal analysis (TGA-DTA) and X-ray photoelectron spectroscopy (XPS). The oxidant co-precipitation method yields catalysts with different BET surface areas, in the range of 33-44 m²/g, depending on the metal salts used. The characterization results confirmed that the catalysts contain a perovskite phase as well as lanthanum oxide, except for LaMnO3, which presents cationic vacancies generated by its well-known oxygen excess. The catalytic tests showed that all the oxides are catalytically active for the total oxidation of CO and propane, although the temperature required for total conversion varies with the transition metal and the substance being oxidized.
Abstract:
New luminometric particle-based methods were developed to quantify protein and to count cells. The developed methods rely on the interaction of the sample with nano- or microparticles and on different principles of detection. In the fluorescence quenching, time-resolved luminescence resonance energy transfer (TR-LRET) and two-photon excitation fluorescence (TPX) methods, the sample prevents the adsorption of labeled protein to the particles. Depending on the system, the addition of the analyte increases or decreases the luminescence. In the dissociation method, the adsorbed protein protects the Eu(III) chelate on the surface of the particles from dissociation at low pH. The experimental setups are user-friendly and rapid and require neither hazardous test compounds nor elevated temperatures. The sensitivity of protein quantification (from 40 to 500 pg of bovine serum albumin per sample) was 20-500-fold better than that of the most sensitive commercial methods. The quenching method exhibited low protein-to-protein variability, and the dissociation method was insensitive to the assay contaminants commonly found in biological samples. Fewer than ten eukaryotic cells were detected and quantified with all the developed methods under optimized assay conditions. Furthermore, two applications, a method for detecting protein aggregation and a cell viability test, were developed utilizing the TR-LRET method. Protein aggregation was detected at a concentration of 30 μg/L, more than 10,000 times lower than with the established methods of UV240 absorbance and dynamic light scattering. The TR-LRET method was combined with a nucleic acid assay using a cell-impermeable dye to measure the percentage of dead cells in a single-tube test with cell counts below 1000 cells/tube.
Abstract:
To identify formulations of biological agents that enable survival, stability and a good surface distribution of the antagonistic agent, studies that test different application vehicles are necessary. The efficiency of two killer yeasts, Wickerhamomyces anomalus (strain 422) and Meyerozyma guilliermondii (strain 443), associated with five different application vehicles, was assessed for the protection of postharvest papayas. In this study, after 90 days of incubation at 4 °C, W. anomalus (strain 422) and M. guilliermondii (strain 443) were viable with all application vehicles tested. Fruits treated with different formulations (yeasts + application vehicles) had a decreased severity of disease (by at least 30%) compared with untreated fruits. The treatment with W. anomalus (strain 422) + 2% starch lowered disease occurrence by 48.3%. The most efficient treatments using M. guilliermondii (strain 443) were those with 2% gelatin or 2% liquid carnauba wax, both of which reduced anthracnose by 50% in postharvest papayas. Electron micrographs of the surface tissues of the treated fruits showed that all application vehicles provided excellent adhesion of the yeast to the surface. Formulations based on starch (2%), gelatin (2%) and carnauba wax (2%) were the most efficient at controlling fungal diseases in postharvest papayas.
Abstract:
Early identification of beginning readers at risk of developing reading and writing difficulties plays an important role in the prevention and provision of appropriate intervention. In Tanzania, as in other countries, there are children in schools who are at risk of developing reading and writing difficulties. Many of these children complete school without being identified and without proper and relevant support. The main language in Tanzania is Kiswahili, a transparent language. Contextually relevant, reliable and valid instruments of identification are needed in Tanzanian schools. This study aimed at the construction and validation of a group-based screening instrument in the Kiswahili language for identifying beginning readers at risk of reading and writing difficulties. In studying the function of the test, there was special interest in analyzing the explanatory power of certain contextual factors related to the home and school. Halfway through grade one, 337 children from four purposively selected primary schools in Morogoro municipality were screened with a group test consisting of seven subscales measuring phonological awareness, word and letter knowledge and spelling. A questionnaire about background factors and the home and school environments related to literacy was also used. The schools were chosen based on performance status (i.e. high, good, average and low performing schools) in order to include variation. For validation, 64 children were chosen from the original sample to take an individual test measuring nonsense word reading, word reading, actual text reading, one-minute reading and writing. School marks from grade one and a follow-up test halfway through grade two were also used for validation. The correlations between the results from the group test and the three measures used for validation were very high (.83-.95). Content validity of the group test was established by using items drawn from authorized textbooks for reading in grade one. Construct validity was analyzed through item analysis and principal component analysis. The difficulty level of most items in both the group test and the follow-up test was good. The items also discriminated well. Principal component analysis revealed one powerful latent dimension (an initial literacy factor), accounting for 93% of the variance. This implies that it could be possible to use any set of the subtests of the group test for screening and prediction. The K-Means cluster analysis revealed four clusters: at-risk children, strugglers, readers and good readers. The main concern in this study was with the groups of at-risk children (24%) and strugglers (22%), who need the most assistance. The predictive validity of the group test was analyzed by correlating the measures from the two school years and by cross-tabulating grade one and grade two clusters. All the correlations were positive and very high, and 94% of the at-risk children in grade two were already identified in the group test in grade one. The explanatory power of some of the home and school factors was very strong. The number of books at home accounted for 38% of the variance in reading and writing ability measured by the group test. Parents' reading ability and the support children received at home for schoolwork were also influential factors. Among the studied school factors, school attendance had the strongest explanatory power, accounting for 21% of the variance in reading and writing ability. Having been in nursery school was also of importance.
Based on the findings of the study, a short version of the group test was created. It is suggested for use in grade-one screening aimed at identifying children at risk of reading and writing difficulties in the Tanzanian context. Suggestions for further research, as well as actions for improving the literacy skills of Tanzanian children, are presented.
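A minimal sketch of the analysis pipeline described above (extracting a single latent "initial literacy" component and then grouping children into four clusters) is given below, using hypothetical subtest scores; the data, shapes and random seed are illustrative stand-ins, not the study's material.

```python
# Sketch of PCA followed by K-Means clustering on hypothetical screening scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
scores = rng.normal(size=(337, 7))          # 337 children x 7 subscales (hypothetical)

z = StandardScaler().fit_transform(scores)
pca = PCA(n_components=1).fit(z)            # single latent "initial literacy" component
print("variance explained:", pca.explained_variance_ratio_[0])

literacy = pca.transform(z)
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(literacy)
# the four clusters would correspond to at-risk children, strugglers, readers, good readers
```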
Abstract:
Internet-based education supporting knowledge-related empowerment for day-surgery orthopaedic patients. The purpose of the study was to develop an Internet-based patient education programme supporting knowledge-related empowerment and to evaluate it. The research process was divided into two phases. In the first phase, the content of the Internet-based education supporting knowledge-related empowerment was created for day-surgery orthopaedic patients. In the second phase, the acceptability of the Internet-based education (experimental group), as evaluated by its users, and its outcomes were assessed, and the outcomes of the Internet-based education (experimental group) were compared with those of nurse-delivered education supporting knowledge-related empowerment (control group). The aim of the study was to create a new form of patient education that offers individualised, participatory education, independent of time and place, to patients awaiting day-surgery orthopaedic operations. In the first phase of the study, a descriptive and comparative research design (pre- and post-testing) was used. The study involved 120 day-surgery orthopaedic patients, whose knowledge expectations and the knowledge provided to them were examined. Based on the results of the first phase and on previous knowledge about empowerment, the content of the Internet-based education supporting knowledge-related empowerment was created. The content was built around six dimensions of empowering knowledge. In the second phase, a randomised experimental design was used. Patients awaiting day-surgery orthopaedic operations were randomised into an experimental group (n=72) receiving Internet-based education and a control group (n=75) receiving nurse-delivered education. The data were collected with structured instruments and analysed statistically. The results show that the developed Internet-based patient education method supporting knowledge-related empowerment can be recommended for the education of day-surgery orthopaedic patients, and that it gives patients good opportunities to become empowered through knowledge. The Internet-based education, with its comprehensive content, proved acceptable from the users' point of view. Although its acceptability was in part rated lower than that of nurse-delivered education, the patients used the website without problems and found it easy to use. The form of education had no effect on the costs of care. However, the costs to the organisation could be reduced, as the time nurses spent on education was halved with the Internet-based education. Patients who received the Internet-based education showed a greater increase in their level of knowledge and in their perceived sufficiency of knowledge after the education than patients who received nurse-delivered education. The form of education had no effect on the intensity of the emotions and symptoms experienced by the patients. In conclusion, Internet-based education supporting knowledge-related empowerment can be recommended as an alternative to nurse-delivered education for patients awaiting day-surgery orthopaedic operations.
Abstract:
The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of newly synthesized compounds increases. The potency of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for ER ligand binding assays. It was used to demonstrate the performance of two-photon excitation of fluorescence (TPFE) versus the conventional one-photon excitation method. As a result, the TPFE method showed improved dynamics, was found to be comparable with the conventional method, and held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g. lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in a time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on quenching of the label by a soluble quencher molecule when the labeled ligand is displaced from the receptor to the solution phase by an unlabeled competing ligand. The new method was compared in parallel with the standard FP method; it yielded comparable results and had a significantly higher signal-to-background ratio than FP. Cell-based functional assays for determining the extent of cell surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression-level assay alone. In this work, an immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in adhesion molecule expression was analyzed on fixed cells by immunocytochemistry, utilizing specific long-lifetime luminophore-labeled antibodies against the chosen adhesion molecules. The results showed that the method was suitable for a multi-parametric assay of the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules through time-resolved luminescence microscopy.
Abstract:
Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem, that of learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, and how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts. Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
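To make the leave-pair-out idea concrete, the sketch below estimates AUC by holding out every positive-negative pair in turn, retraining on the remaining data, and checking whether the held-out positive is scored higher; the synthetic data and the ridge classifier are illustrative stand-ins, not the thesis's RankRLS implementation.

```python
# Leave-pair-out cross-validation sketch for AUC estimation on synthetic data.
import numpy as np
from itertools import product
from sklearn.linear_model import RidgeClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=40, n_features=5, random_state=0)
pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]

wins, ties, total = 0, 0, 0
for i, j in product(pos, neg):                      # every positive-negative pair
    mask = np.ones(len(y), dtype=bool)
    mask[[i, j]] = False                            # hold the pair out
    clf = RidgeClassifier().fit(X[mask], y[mask])
    si, sj = clf.decision_function(X[[i, j]])       # scores for the held-out pair
    wins += si > sj
    ties += si == sj
    total += 1

print("leave-pair-out AUC estimate:", (wins + 0.5 * ties) / total)
```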
Abstract:
As long as the incidence of stroke continues to grow, patients with large right-hemisphere lesions suffering from hemispatial neglect will require neuropsychological evaluation and rehabilitation. The inability to process information, especially information coming from the left side, accompanied by a magnetic orientation toward the ipsilesional side, represents a real challenge for rehabilitation. This dissertation is concerned with crucial aspects of the clinical neuropsychological practice of hemispatial neglect. In studying the convergence of the visual and behavioural test batteries in the assessment of neglect, nine of the seventeen patients who completed both the conventional subtests of the Behavioural Inattention Test and the Catherine Bergego Scale assessments showed a similar severity of neglect and thus good convergence between the tests. However, patients with neglect and hemianopia had poorer scores in the line bisection test and displayed stronger neglect in behaviour than patients with pure neglect. The second study examined whether arm activation, modified from Constraint-Induced Movement Therapy, could be applied as neglect rehabilitation alone, without any visual training. Twelve acute or subacute patients were randomized into two rehabilitation groups: arm activation training or traditional voluntary visual scanning training. Neglect was ameliorated significantly or almost significantly in both training groups, with the effect being maintained for at least six months. In studying the reflections of hemispatial neglect on visual memory, the associations between the severity of neglect and visual memory performance were explored. The performances of acute and subacute patients with hemispatial neglect were compared with those of matched healthy control subjects. As hypothesized, encoding from the left side and immediate recall of visual material were significantly compromised in patients with neglect. Another mechanism by which neglect affects visual memory processes is observed in delayed visual reproduction. Delayed recall demands that the individual make a match aided by a cue, or it requires a search for relevant material from long-term memory storage. In the case of representational neglect, the search may succeed, but the left side of the recollected memory still fails to open. Visual and auditory evoked potentials were measured in 21 patients with hemispatial neglect. Stimuli coming from the left or right were processed differently in both sensory modalities in acute and subacute patients as compared with chronic patients, and the differences equalized during the course of recovery. Recovery from hemispatial neglect was strongly associated with early rehabilitation and with the severity of neglect. Extinction was common in patients with neglect and did not ameliorate with the recovery of neglect. The presence of the pusher symptom hampered the amelioration of visual neglect in acute and subacute stroke patients, whereas depression did not have any significant effect in the early phases after the stroke. However, depression had an unfavourable effect on recovery in the chronic phase. In conclusion, the combination of neglect and hemianopia may explain part of the residual behavioural neglect that is no longer evident in visual testing. Further research is needed to determine which specific rehabilitation procedures would be most beneficial for patients suffering from the combination of neglect and hemianopia.
Arm activation should be included in neglect rehabilitation programs; it is a useful technique for patients who need bedside treatment in the acute phase. With respect to the deficit in visual memory associated with neglect, the possible mechanisms of the lateralized deficit in delayed recall need to be further examined and clarified. Intensive treatment induced recovery in both severe and moderate visual neglect even long after the first two to three months after the stroke.
Abstract:
Sisal fiber is an important agricultural product used in the manufacture of ropes and rugs and as a reinforcement of polymeric or cement-based composites. However, the fiber production process generates a large amount of residue that currently has low potential for commercial use. The aim of this study is to characterize the agricultural residues from the production and processing of sisal fiber, known as field bush and refugo, and to verify their potential use as reinforcement in cement-based composites. The residues were treated with wet-dry cycles and evaluated using tensile testing of the fibers, scanning electron microscopy (SEM) and Fourier transform infrared (FTIR) spectroscopy. Compatibility with the cement-based matrix was evaluated through fiber pull-out tests and flexural tests on composites reinforced with 2% sisal residues. The results indicate that the use of treated residue allows the production of composites with good mechanical properties, superior to those of traditional composites reinforced with natural sisal fibers.
Abstract:
CHARGE syndrome, Sotos syndrome and 3p deletion syndrome are examples of rare inherited syndromes that have been recognized for decades but for which molecular diagnostics have only been made possible by recent advances in genomic research. Despite these advances, the development of diagnostic tests for rare syndromes has been hindered by diagnostic laboratories having limited funds for test development and by their prioritization of tests for which a (relatively) high demand can be expected. In this study, molecular diagnostic tests for CHARGE syndrome and Sotos syndrome were developed and successfully translated into routine diagnostic testing in the laboratory of Medical Genetics (UTUlab). A mutation was identified in 40.5% of the patients in the CHARGE syndrome group and in 34% of the Sotos syndrome group, reflecting the use of the tests in routine differential diagnostics. In CHARGE syndrome, the low prevalence of structural aberrations was also confirmed. In 3p deletion syndrome, it was shown that small terminal deletions are not causative for the syndrome, and that array-based testing provides a reliable estimate of the deletion size, although benign copy number variants complicate the interpretation of results. During the development of the tests, it was discovered that finding an optimal molecular diagnostic strategy for a given syndrome is always a compromise between the sensitivity, specificity and feasibility of applying a new method. In addition, the clinical utility of a test should be considered prior to test development: sometimes a test that performs well in the laboratory has limited utility for the patient, whereas a test that performs poorly in the laboratory may have a great impact on the patient and their family. At present, the development of next-generation sequencing methods is shifting the molecular diagnostics of rare diseases from single tests towards whole-genome analysis.
Abstract:
The maintenance of the electric distribution network is a topical question for distribution system operators because of the increasing significance of failure costs. In this dissertation, the maintenance practices of distribution system operators are analyzed and a theory for scheduling the maintenance and reinvestment of distribution components is created. The scheduling is based on the deterioration of components and the increasing failure rates caused by aging. A dynamic programming algorithm is used to solve the maintenance problem arising from the increasing failure rates of the network. Other drivers of network maintenance, such as environmental and regulatory reasons, are outside the scope of this thesis. Likewise, tree trimming of the line corridors and major network disturbances are not included in the problem optimized here. For the optimization, four dynamic programming models are presented and tested; the models are implemented in VBA. Two different kinds of test networks are used for the testing. Because electric distribution system operators prefer to operate on larger component groups, optimal timing for component groups is also analyzed. A maintenance software package was created to apply the presented theories in practice, and an overview of the program is given.
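As a rough illustration of the age-based scheduling idea described above, the sketch below uses backward-induction dynamic programming to decide each year whether to keep or replace a component whose failure rate grows with age; the cost figures, horizon and failure-rate curve are hypothetical and are not taken from the dissertation's models.

```python
# Backward-induction sketch: keep-or-replace scheduling under an increasing failure rate.
HORIZON = 20            # planning years
MAX_AGE = 40
REPLACE_COST = 10_000.0
FAILURE_COST = 4_000.0

def failure_rate(age: int) -> float:
    """Hypothetical failure rate (failures per year) increasing with component age."""
    return 0.02 + 0.004 * age

def solve():
    # value[t][age] = minimum expected cost from year t onward for a component of this age
    value = [[0.0] * (MAX_AGE + 1) for _ in range(HORIZON + 1)]
    policy = [[None] * (MAX_AGE + 1) for _ in range(HORIZON)]
    for t in range(HORIZON - 1, -1, -1):
        for age in range(MAX_AGE):
            keep = FAILURE_COST * failure_rate(age) + value[t + 1][age + 1]
            replace = REPLACE_COST + FAILURE_COST * failure_rate(0) + value[t + 1][1]
            value[t][age], policy[t][age] = min((keep, "keep"), (replace, "replace"))
        # a component at the maximum modeled age must be replaced
        value[t][MAX_AGE] = REPLACE_COST + FAILURE_COST * failure_rate(0) + value[t + 1][1]
    return value, policy

value, policy = solve()
print("expected cost, new component:", round(value[0][0]),
      "| year-0 action for a 15-year-old component:", policy[0][15])
```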
Abstract:
In this doctoral thesis, methods are developed to estimate the expected power cycling life of power semiconductor modules based on chip temperature modeling. Frequency converters operate under dynamic loads in most electric drives. The varying loads cause thermal expansion and contraction, which stresses the internal boundaries between the material layers in the power module. Eventually, this stress wears out the semiconductor modules. The wear-out cannot be detected by traditional temperature or current measurements inside the frequency converter; it is therefore important to develop a method to predict the end of the converter lifetime. The thesis concentrates on power-cycling-related failures of insulated gate bipolar transistors (IGBTs). Two types of power modules are discussed: a direct bonded copper (DBC) sandwich structure with and without a baseplate. The most common failure mechanisms are reviewed, and methods to improve the power cycling lifetime of the power modules are presented. Power cycling curves are determined for a module with a lead-free solder by accelerated power cycling tests. A lifetime model is selected and its parameters are updated based on the power cycling test results. According to the measurements, the power cycling lifetime of modern IGBT power modules has improved by a factor of more than 10 over the last decade. It is also observed that a 10 °C increase in the chip temperature cycle amplitude decreases the lifetime by 40%. A thermal model for chip temperature estimation is developed, based on estimating the power losses of the chip from the output current of the frequency converter. The model is verified with purpose-built test equipment, which allows simultaneous measurement and simulation of the chip temperature with an arbitrary load waveform. The measurement system proves convenient for studying the thermal behavior of the chip, and the thermal model is found to estimate the temperature with an accuracy of 5 °C. The temperature cycles that the power semiconductor chip has experienced are counted by the rainflow algorithm. The counted cycles are compared with the experimentally verified power cycling curves to estimate the life consumption based on the mission profile of the drive. The methods are validated by the lifetime estimation of a power module in a direct-driven wind turbine. The estimated lifetime of the IGBT power module in a direct-driven wind turbine is 15 000 years, if the turbine is located in south-eastern Finland.
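A minimal sketch of the damage-accumulation step described above is given below: rainflow-counted temperature cycle amplitudes are compared against a lifetime curve and summed with Miner's rule. The exponential form of the lifetime curve and its reference constants are assumptions; only the 40%-per-10 °C figure is taken from the abstract, and the listed cycle amplitudes stand in for real rainflow output.

```python
# Miner's-rule lifetime sketch from assumed rainflow-counted temperature cycles.
import math

K = -math.log(0.6) / 10.0            # ~0.051 per degC, from the 40%-per-10-degC figure above
N_REF = 1.0e6                        # assumed cycles to failure at the reference amplitude
DT_REF = 30.0                        # assumed reference cycle amplitude in degC

def cycles_to_failure(dT: float) -> float:
    """Assumed exponential lifetime curve calibrated to the 40%/10 degC reduction."""
    return N_REF * math.exp(-K * (dT - DT_REF))

# (amplitude in degC, cycle count per year) pairs, e.g. from rainflow counting of a mission profile
mission_profile = [(20.0, 50_000), (35.0, 8_000), (60.0, 300)]

damage_per_year = sum(n / cycles_to_failure(dT) for dT, n in mission_profile)
print("estimated lifetime in years:", round(1.0 / damage_per_year, 1))
```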
Abstract:
PURPOSE: To analyze the prevalence of and factors associated with fragility fractures in Brazilian women aged 50 years and older. METHODS: This cross-sectional population survey, conducted between May 10 and October 31, 2011, included 622 women aged >50 years living in a city in southeastern Brazil. A questionnaire was administered to each woman by a trained interviewer. The associations between the occurrence of a fragility fracture after age 50 years and sociodemographic data, health-related habits and problems, self-perception of health and evaluation of functional capacity were determined by the χ² test and Poisson regression using backward selection criteria. RESULTS: The mean age of the 622 women was 64.1 years. The prevalence of fragility fractures was 10.8%, with 1.8% reporting hip fracture. In the final statistical model, a longer time since menopause (PR 1.03; 95% CI 1.01-1.05; p<0.01) and osteoporosis (PR 1.97; 95% CI 1.27-3.08; p<0.01) were associated with a higher prevalence of fractures. CONCLUSIONS: These findings may provide a better understanding of the risk factors associated with fragility fractures in Brazilian women and emphasize the importance of performing bone densitometry.
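As a brief illustration of how a prevalence ratio (PR) like those reported above is read, the sketch below computes a crude PR and its 95% confidence interval for a binary factor such as osteoporosis from a 2x2 table; the counts are hypothetical, and the abstract's adjusted PRs come from Poisson regression rather than this crude calculation.

```python
# Crude prevalence ratio and 95% CI from a hypothetical 2x2 table.
import math

a, b = 30, 170    # fractures / no fractures among women with osteoporosis (hypothetical)
c, d = 37, 385    # fractures / no fractures among women without osteoporosis (hypothetical)

p_exposed = a / (a + b)
p_unexposed = c / (c + d)
pr = p_exposed / p_unexposed

# standard error of the log prevalence ratio
se_log_pr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
low, high = (math.exp(math.log(pr) + z * se_log_pr) for z in (-1.96, 1.96))
print(f"PR = {pr:.2f} (95% CI {low:.2f}-{high:.2f})")
```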