18 results for "Link characteristics" at Universidade do Minho
Abstract:
High-risk human papillomavirus (hrHPV) is an essential cause of cervical carcinoma and is also strongly related to anal cancer development. The hrHPV E6 oncoprotein plays a major role in carcinogenesis. We aimed to evaluate the frequency of hrHPV DNA and E6 oncoprotein in the anuses of women with cervical carcinoma. We analyzed 117 women with cervical cancer and 103 controls for hrHPV and the E6 oncogene. Among the cervical carcinoma cases, 66.7 % tested positive for hrHPV-16 and 7.7 % for hrHPV-18; one case tested positive for both HPV variants (0.9 %). The samples from the anal canal were positive for HPV-16 in 59.8 % of the cases. Simultaneous presence of HPV in the cervix and anal canal was found in 53.8 % of the cases. Regarding expression of E6 RNA, positivity for HPV-16 in the anal canal was found in 21.2 % of the cases, positivity for HPV-16 in the cervix in 75.0 %, and positivity for HPV-18 in the cervix in 1.9 %. E6 expression in both the cervix and anal canal was found in 19.2 % of the cases. In the controls, 1 % tested positive for HPV-16 and 0 % for HPV-18. Anal samples from the controls showed an hrHPV frequency of 4.9 % (only HPV-16). The presence of hrHPV in the anal canal of women with cervical cancer was detected at a high frequency. We also detected E6 RNA expression in the anal canal of women with cervical cancer, suggesting that these women are at risk for anal hrHPV infection.
Abstract:
"The idea that social processes develop in a cyclical manner is somewhat like a 'Lorelei'. Researchers are lured to it because of its theoretical promise, only to become entangled in (if not wrecked by) messy problems of empirical inference. The reasoning leading to hypotheses of some kind of cycle is often elegant enough, yet the data from repeated observations rarely display the supposed cyclical pattern. (...) In addition, various 'schools' seem to exist which frequently arrive at different conclusions on the basis of the same data." (van der Eijk and Weber 1987:271). Much of the empirical controversy around these issues arises from three distinct problems: the coexistence of cycles of different periodicities, the possibility of transient cycles, and the existence of cycles without fixed periodicity. In some cases, there is no reason to expect any of these phenomena to be relevant. Seasonality caused by Christmas is one such example (Wen 2002). In such cases, researchers mostly rely on spectral analysis and Auto-Regressive Moving-Average (ARMA) models to estimate the periodicity of cycles. However, and this is particularly true in the social sciences, sometimes there are good theoretical reasons to expect irregular cycles. In such cases, "the identification of periodic movement in something like the vote is a daunting task all by itself. When a pendulum swings with an irregular beat (frequency), and the extent of the swing (amplitude) is not constant, mathematical functions like sine-waves are of no use." (Lebo and Norpoth 2007:73) In the past, this difficulty has led to two different approaches. On the one hand, some researchers dismissed these methods altogether, relying on informal alternatives that do not meet rigorous standards of statistical inference. Goldstein (1985, 1988), who studied the severity of Great Power wars, is one such example.
On the other hand, there are authors who transfer the assumptions of spectral analysis (and ARMA models) into fundamental assumptions about the nature of social phenomena. This type of argument was produced by Beck (1991) who, in a reply to Goldstein (1988), claimed that only "fixed period models are meaningful models of cyclic phenomena". We argue that wavelet analysis, a mathematical framework developed in the mid-1980s (Grossman and Morlet 1984; Goupillaud et al. 1984), is a very viable alternative for studying cycles in political time-series. It has the advantage of staying close to the frequency-domain approach of spectral analysis while addressing its main limitations. Its principal contribution comes from estimating the spectral characteristics of a time-series as a function of time, thus revealing how its different periodic components may change over time. The rest of the article proceeds as follows. In the section "Time-frequency Analysis", we study in some detail the continuous wavelet transform and compare its time-frequency properties with the more standard tool for that purpose, the windowed Fourier transform. In the section "The British Political Pendulum", we apply wavelet analysis to essentially the same data analyzed by Lebo and Norpoth (2007) and Merrill, Grofman and Brunell (2011) and try to provide a more nuanced answer to the same question discussed by these authors: do British electoral politics exhibit cycles? Finally, in the last section, we present a concise list of future directions.
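To make the time-frequency idea concrete, the continuous wavelet transform with a Morlet mother wavelet can be sketched in a few lines of NumPy. This is a generic illustration under stated assumptions (synthetic signal, scale grid, and ω₀ = 6 are all hypothetical choices), not the authors' code or data:

```python
import numpy as np

def morlet_cwt(signal, scales, dt=1.0, omega0=6.0):
    """Continuous wavelet transform with a Morlet mother wavelet.

    Returns a (len(scales), len(signal)) complex array; the squared
    magnitude is the wavelet power as a function of time and scale.
    """
    n = len(signal)
    # FFT-based convolution: transform the signal once, then multiply
    # by the wavelet's frequency response at each scale.
    sig_hat = np.fft.fft(signal)
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
    coeffs = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Morlet wavelet in the frequency domain (analytic: positive freqs only)
        psi_hat = (np.pi ** -0.25) * np.exp(-0.5 * (s * omega - omega0) ** 2)
        psi_hat *= np.sqrt(2.0 * np.pi * s / dt) * (omega > 0)
        coeffs[i] = np.fft.ifft(sig_hat * psi_hat)
    return coeffs

# A signal whose period changes halfway through -- exactly the situation
# where a single global Fourier spectrum is misleading but a scalogram is not.
t = np.arange(1024)
x = np.where(t < 512, np.sin(2 * np.pi * t / 32), np.sin(2 * np.pi * t / 64))
scales = np.arange(2, 64)
power = np.abs(morlet_cwt(x, scales)) ** 2
# The dominant scale in each half differs, reflecting the period change.
early = scales[power[:, :512].mean(axis=1).argmax()]
late = scales[power[:, 512:].mean(axis=1).argmax()]
```

Because the transform is local in time, the shift in dominant scale between the two halves is recovered directly, which is precisely what a global spectral estimate cannot show.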
Abstract:
This chapter aims at developing a taxonomic framework to classify the studies on the flexible job shop scheduling problem (FJSP). The FJSP is a generalization of the classical job shop scheduling problem (JSP), which is one of the oldest NP-hard problems. Although various solution methodologies have been developed to obtain good solutions in reasonable time for FJSPs with different objective functions and constraints, no study which systematically reviews the FJSP literature has been encountered. In the proposed taxonomy, the type of study, type of problem, objective, methodology, data characteristics, and benchmarking are the main categories. In order to verify the proposed taxonomy, a variety of papers from the literature are classified. Using this classification, several inferences are drawn and gaps in the FJSP literature are specified. With the proposed taxonomy, the aim is to develop a framework for a broad view of the FJSP literature and construct a basis for future studies.
Abstract:
This study deals with the characterization of masonry mortars produced with different binders and sands. Several properties of the mortars were determined, such as consistence, compressive and flexural strengths, shrinkage and fracture energy. By varying the type of binder (Portland cement, hydrated lime and hydraulic lime) and the type of sand (natural or artificial), it was possible to draw some conclusions about the influence of the composition on mortar properties. The results showed that the use of Portland cement makes it easier to achieve high strength classes, owing to the slower hardening of lime compared with cement. The fracture energy tests showed much higher values for artificial sand mortars than for natural sand ones. This is due to the higher roughness of the artificial sand particles, which provides better adhesion between sand and binder.
Abstract:
Structural analysis involves the definition of the model and the selection of the analysis type. The model should represent the stiffness, the mass and the loads of the structure. Structures can be represented using simplified models, such as lumped mass models, or advanced models resorting to the Finite Element Method (FEM) and the Discrete Element Method (DEM). Depending on the characteristics of the structure, different types of analysis can be used, such as limit analysis, linear and non-linear static analysis, and linear and non-linear dynamic analysis. Unreinforced masonry structures present low tensile strength, and linear analyses do not seem adequate for assessing their structural behaviour. On the other hand, static and dynamic non-linear analyses are complex, since they involve large computational time requirements and advanced knowledge from the practitioner regarding material properties, analysis tools and the interpretation of results. Limit analysis with macro-blocks can be assumed as a more practical method for estimating the maximum load capacity of a structure. Furthermore, limit analysis requires a reduced number of parameters, which is an advantage for the assessment of ancient and historical masonry structures, given the difficulty of obtaining reliable data.
Abstract:
Customer lifetime value (LTV) enables using client characteristics, such as recency, frequency and monetary (RFM) value, to describe the value of a client through time in terms of profitability. We present the concept of LTV applied to telemarketing for improving the return on investment, using a recent (2008 to 2013) and real case study of bank campaigns to sell long-term deposits. The goal was to benefit from past contact history to extract additional knowledge. A total of twelve LTV input variables were tested, under a forward selection method and using a realistic rolling windows scheme, highlighting the validity of five new LTV features. The results achieved by our LTV data-driven approach using neural networks allowed an improvement of up to 4 pp in the cumulative Lift curve for targeting the deposit subscribers when compared with a baseline model (with no history data). Explanatory knowledge was also extracted from the proposed model, revealing two highly relevant LTV features: the last result of the previous campaign to sell the same product, and the frequency of past client successes. The obtained results are particularly valuable for contact-center companies, which can improve predictive performance without even having to ask the companies they serve for more information.
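The cumulative Lift metric used above to compare models can be sketched as follows. This is a generic illustration with synthetic data (the score model and class balance are hypothetical assumptions, not the study's bank dataset):

```python
import numpy as np

def cumulative_lift(y_true, y_score, percentile):
    """Fraction of all positives captured when contacting only the top
    `percentile` of clients ranked by model score, divided by the fraction
    that random targeting would capture (lift = 1 means no better than random)."""
    order = np.argsort(-np.asarray(y_score))        # best scores first
    k = max(1, int(len(order) * percentile))
    captured = np.asarray(y_true)[order[:k]].sum() / np.sum(y_true)
    return captured / percentile

# Toy example: an informative but noisy score that tends to rank
# subscribers (label 1) above non-subscribers (label 0).
rng = np.random.default_rng(0)
y = (rng.random(1000) < 0.1).astype(int)            # ~10% subscribe
score = y * 0.5 + rng.random(1000) * 0.6
lift_at_10 = cumulative_lift(y, score, 0.10)        # lift in the top decile
```

An improvement of "4 pp in the Lift cumulative curve" then means the model's captured-positives fraction at a given contact percentile rises by four percentage points over the baseline.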
Abstract:
Traffic Engineering (TE) approaches are increasingly important in network management to allow an optimized configuration and resource allocation. In link-state routing, setting appropriate weights on the links is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work addresses the evaluation of three distinct EAs, one single-objective and two multi-objective, in two tasks related to weight-setting optimization towards optimal intra-domain routing, knowing the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where there is a dynamic alteration in the state of the system: the first considers changes in the traffic demand matrices, and the second considers the possibility of link failures. The methods thus need to simultaneously optimize for both conditions, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs such as SPEA2 and NSGA-II came naturally; these were compared with a single-objective EA. The results show a remarkable behavior of NSGA-II in all proposed tasks, scaling well for harder instances and thus presenting itself as the most promising option for TE in these scenarios.
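The bi-objective formulation above means each weight setting is scored once per scenario (normal and altered), and multi-objective EAs such as NSGA-II select among settings by Pareto dominance. A minimal sketch of that selection core, with hypothetical objective values rather than any real congestion model:

```python
def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if it is no worse
    in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Return the Pareto front of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Each point = (congestion under normal demands, congestion after a link
# failure) for one candidate weight setting -- values here are made up.
population = [(1.2, 3.0), (1.5, 2.0), (2.0, 2.5), (1.1, 3.5), (1.5, 2.4)]
front = non_dominated(population)
```

No single setting on the front is "best" at both objectives at once; a preventive TE approach picks a compromise point from this front rather than optimizing the normal scenario alone.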
Abstract:
Human activity is very dynamic and subtle, and most physical environments are also highly dynamic and support a vast range of social practices that do not map directly into any immediate ubiquitous computing functionality. Identifying what is valuable to people is very hard and obviously leads to great uncertainty regarding the type of support needed and the type of resources needed to create such support. We have addressed the issues of system development through the adoption of a crowdsourced software development model [13]. We have designed and developed Anywhere places, an open and flexible system support infrastructure for ubiquitous computing that is based on a balanced combination of global services and applications and situated devices. Evaluation, however, is still an open problem. The characteristics of ubiquitous computing environments make their evaluation very complex: there are no globally accepted metrics, and it is very difficult to evaluate large-scale and long-term environments in real contexts. In this paper, we describe a first proposal of a hybrid 3D simulated prototype of Anywhere places that combines simulated and real components to generate a mixed reality which can be used to assess the envisaged ubiquitous computing environments [17].
Abstract:
Forming suitable learning groups is one of the factors that determine the efficiency of collaborative learning activities. However, only a few studies have addressed this problem in mobile learning environments. In this paper, we propose a new approach for automatic, customized, and dynamic group formation in Mobile Computer Supported Collaborative Learning (MCSCL) contexts. The proposed solution is based on the combination of three types of grouping criteria: the learner's personal characteristics, the learner's behaviours, and context information. Instructors can freely select the type, the number, and the weight of the grouping criteria, together with other settings such as the number, the size, and the type of learning groups (homogeneous or heterogeneous). Apart from a grouping mechanism, the proposed approach provides a flexible tool to monitor each learner and to manage the learning processes from the beginning to the end of collaborative learning activities. In order to evaluate the quality of the implemented group formation algorithm, we compare its Average Intra-cluster Distance (AID) with that of a random group formation method. The results show a higher effectiveness of the proposed algorithm in forming homogeneous and heterogeneous groups compared to the random method.
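The AID comparison above can be illustrated with a small sketch. This is a hypothetical reconstruction of the metric under stated assumptions (a single numeric grouping criterion and groups of equal size), not the paper's algorithm:

```python
import numpy as np

def average_intra_cluster_distance(scores, groups):
    """Mean pairwise distance between members of the same group,
    averaged over all groups (lower = more homogeneous groups)."""
    per_group = []
    for members in groups:
        pts = scores[members]
        d = [abs(pts[i] - pts[j])
             for i in range(len(pts)) for j in range(i + 1, len(pts))]
        per_group.append(sum(d) / len(d))
    return sum(per_group) / len(per_group)

rng = np.random.default_rng(1)
scores = rng.random(12)                  # one criterion score per learner
# Homogeneous grouping: sort by score, then split into 3 groups of 4,
# so similar learners end up together.
order = np.argsort(scores)
homogeneous = [order[i:i + 4] for i in range(0, 12, 4)]
# Baseline: a random partition into groups of the same size.
shuffled = rng.permutation(12)
random_groups = [shuffled[i:i + 4] for i in range(0, 12, 4)]
aid_homog = average_intra_cluster_distance(scores, homogeneous)
aid_random = average_intra_cluster_distance(scores, random_groups)
```

A lower AID for the similarity-based partition than for the random one is what "higher effectiveness in forming homogeneous groups" measures; for heterogeneous groups the comparison is inverted.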
Abstract:
Security risk management is, by definition, a subjective and complex exercise, and it takes time to perform properly. Human resources are fundamental assets for any organization, and like any other asset, they have inherent vulnerabilities that need to be handled, i.e. managed and assessed. However, the nature of human behavior and the organizational environment in which people work make these tasks extremely difficult, hard to accomplish and prone to errors. Treating security as a cost, organizations are usually focused on the efficiency of the security mechanisms implemented to protect against external attacks, disregarding insider risks, which are much more difficult to assess. All this demands an interdisciplinary approach that combines technical solutions with psychological approaches in order to understand the organizational staff and detect any changes in their behaviors and characteristics. This paper discusses some methodological challenges in evaluating insider threats and their impacts, and integrates them in a security risk framework, defined according to the security standard ISO/IEC_JTC1, to support the security risk management process.
Abstract:
Integrated master's dissertation in Psychology
Abstract:
This work demonstrates the role of defects, generated during rapid thermal annealing of pulsed laser deposited ZnO/Al2O3 multilayer nanostructures in vacuum at different temperatures (Ta, 500–900 °C), on their electrical conductance and optical characteristics. Photoluminescence (PL) emissions show a stronger green emission at Ta ≤ 600 °C and a violet–blue emission at Ta ≥ 800 °C, attributed to oxygen vacancies and zinc-related defects (zinc vacancies and interstitials), respectively. Current–voltage (I–V) characteristics of nanostructures rich in oxygen vacancies and zinc-related defects display electroforming-free resistive switching (RS) characteristics. Nanostructures rich in oxygen vacancies exhibit conventional and stable RS behavior, with a high-to-low resistance state (HRS/LRS) ratio of about 10⁴ during the retention test. The dominant conduction mechanism of the HRS and LRS is explained by a trap-controlled space-charge-limited conduction mechanism, where the oxygen vacancies act as traps. On the other hand, nanostructures rich in zinc-related defects show a diode-like RS behavior. The rectifying ratio is found to be sensitive to the zinc interstitial concentration. It is assumed that the rectifying behavior is due to an electrically formed ZnAl2O4 interface layer between the Zn-defect-rich ZnO crystals and the Al2O3, and the switching behavior is attributed to electron trapping/de-trapping at zinc vacancies.
Abstract:
The effect of varying separator membrane physical parameters, such as degree of porosity, tortuosity and thickness, on battery delivered capacity was studied in order to optimize the performance of lithium-ion batteries. This was achieved by a theoretical mathematical model relating the Bruggeman coefficient to the degree of porosity and tortuosity. The inclusion of the separator membrane in the simulation model of the battery system does not in itself affect the delivered capacity of the battery. The ionic conductivity of the separator, and consequently the delivered capacity values obtained at different discharge rates, depend on the value of the Bruggeman coefficient, which is related to the degree of porosity and tortuosity of the membrane. Independently of the discharge rate, the optimal degree of porosity is above 50 %, and the separator thickness should range between 1 μm and 32 μm for improved battery performance.
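The Bruggeman relation referred to above ties the separator's effective ionic conductivity to its porosity through a single exponent. A minimal sketch, assuming the common form σ_eff = σ_bulk · ε^α with α ≈ 1.5 and a hypothetical bulk electrolyte conductivity (not values from the study):

```python
def effective_conductivity(sigma_bulk, porosity, alpha=1.5):
    """Bruggeman relation: sigma_eff = sigma_bulk * porosity**alpha.
    Equivalently sigma_eff = sigma_bulk * porosity / tortuosity,
    with tortuosity = porosity**(1 - alpha)."""
    return sigma_bulk * porosity ** alpha

def tortuosity(porosity, alpha=1.5):
    """Tortuosity implied by the Bruggeman exponent (>= 1 for porosity < 1)."""
    return porosity ** (1 - alpha)

sigma_bulk = 1.0e-2                 # S/cm, hypothetical electrolyte value
low = effective_conductivity(sigma_bulk, 0.40)
high = effective_conductivity(sigma_bulk, 0.55)
```

This makes the abstract's conclusion quantitative: raising the degree of porosity lowers the effective tortuosity and raises the effective conductivity, which is why porosities above 50 % favor delivered capacity.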
Abstract:
Poly(vinylidene fluoride), PVDF, films and membranes were prepared by solvent casting from dimethylformamide, DMF, by systematically varying the polymer/solvent ratio and the solvent evaporation temperature. The effect of the processing conditions on the morphology, degree of porosity, mechanical and thermal properties, and crystalline phase of the polymer was evaluated. The obtained microstructure is explained by the Flory-Huggins theory. For the binary system, the porous membrane formation is attributed to spinodal decomposition of the liquid-liquid phase separation. The morphological features were simulated through the correlation between the total Gibbs free energy and the Flory-Huggins theory. This correlation allowed the calculation of the PVDF/DMF phase diagram and the evolution of the microstructure in different regions of the phase diagram. Varying the preparation conditions allows tailoring the polymer microstructure while maintaining a high degree of crystallinity and a large β crystalline phase content. Further, the membranes show adequate mechanical properties for applications in filtration or as battery separator membranes.
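The Flory-Huggins free energy and the spinodal condition invoked above can be written down directly. The sketch below uses the standard polymer-solvent form of the theory with an illustrative degree of polymerization; the parameter values are assumptions, not the PVDF/DMF values fitted in the study:

```python
import math

def flory_huggins_g(phi, chi, n_poly):
    """Flory-Huggins free energy of mixing per lattice site (units of kT)
    for a polymer of n_poly segments at polymer volume fraction phi."""
    return (phi / n_poly) * math.log(phi) \
        + (1.0 - phi) * math.log(1.0 - phi) \
        + chi * phi * (1.0 - phi)

def spinodal_chi(phi, n_poly):
    """chi on the spinodal, where d^2g/dphi^2 = 1/(n*phi) + 1/(1-phi) - 2*chi = 0.
    For chi above this value the mixture is locally unstable."""
    return 0.5 * (1.0 / (n_poly * phi) + 1.0 / (1.0 - phi))

n = 100                              # illustrative degree of polymerization
phi = 0.3                            # illustrative polymer volume fraction
chi = 1.2 * spinodal_chi(phi, n)     # an interaction parameter inside the spinodal
unstable = chi > spinodal_chi(phi, n)
```

Inside the spinodal region the homogeneous solution is unstable to infinitesimal composition fluctuations and separates spontaneously, which is the mechanism the abstract credits for pore formation in the membranes.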
Abstract:
This work presents a study on the properties of chicken plumage from the poultry industry, namely feathers. The studies conducted include the length, diameter and weight of feathers from chickens up to 42 days old. Initial results indicate that the central and lower feather areas present very interesting properties and characteristics, making these materials suitable for various applications, including clothing, thermal and acoustic insulation in buildings, and reinforcement of cementitious and polymeric matrices, among others.