908 results for probability and reinforcement proportion
Abstract:
This study investigates the optimization of the in-plane shear strength of co-cured overlap joints of a self-reinforced unidirectional thermoplastic composite, consisting of recycled low-density polyethylene reinforced with ultra-high molecular weight polyethylene fibers, by relating this strength to the hot-pressing parameters used to form the joint (pressure, temperature, time, and overlap length). The chemical structure of the matrix was analyzed to check for potential degradation arising from its recycled origin. Matrix and reinforcement were thermally characterized to define the joint processing temperature window to be studied. The curing conditions of the specimens were defined according to a Response Surface Design of Experiments methodology, and the relationship between joint shear strength and the corresponding curing parameters was obtained through a regression equation generated by the Ordinary Least Squares method. The tensile mechanical behavior of the material was analyzed micro- and macromechanically. The chemical analysis of the matrix showed no carboxylic groups that would indicate degradation by chain branching and crosslinking resulting from recycling. The proposed test methodologies proved effective and may serve as a basis for the development of technical standards. It was shown that joints with an optimal shear strength of 6.88 MPa can be obtained when processed at 1 bar, 115 °C, 5 min, and a 12 mm overlap. Fracture analysis revealed that shear failure of the joints was preceded by multiple longitudinal cracks induced by successive debondings, both inside and outside the joint, caused by the transverse stress accumulated in the joint, which is proportional to its length. Temperature proved to be the most relevant processing parameter for joint performance, which is little affected by variations in pressure and curing time.
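As a hedged illustration of the regression step described above (a quadratic response surface fitted by Ordinary Least Squares), the Python sketch below fits such a surface to invented shear-strength data; the variable names and values are placeholders, not the study's actual dataset, and only two of the four factors are shown for brevity.

    import numpy as np

    # Hypothetical curing design points: temperature (degC) and overlap length (mm).
    # Values are illustrative placeholders, not the study's measurements.
    T = np.array([105.0, 105.0, 115.0, 115.0, 125.0, 125.0, 115.0, 110.0, 120.0])
    Lmm = np.array([8.0, 12.0, 8.0, 12.0, 8.0, 12.0, 10.0, 10.0, 10.0])
    tau = np.array([5.1, 5.9, 6.2, 6.9, 5.8, 6.4, 6.7, 6.0, 6.3])  # shear strength (MPa)

    # Full quadratic response-surface design matrix: 1, T, L, T^2, L^2, T*L
    X = np.column_stack([np.ones_like(T), T, Lmm, T**2, Lmm**2, T * Lmm])

    # Ordinary Least Squares estimate of the regression coefficients
    beta, *_ = np.linalg.lstsq(X, tau, rcond=None)

    def predicted_strength(temp, length):
        """Shear strength (MPa) predicted by the fitted quadratic surface."""
        x = np.array([1.0, temp, length, temp**2, length**2, temp * length])
        return float(x @ beta)

    print(predicted_strength(115.0, 12.0))  # evaluate near the reported optimum

The optimum processing condition is then found by locating the stationary point (or constrained maximum) of the fitted surface over the experimental region.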
Abstract:
This thesis consists of three essays on information economics. I explore how information is strategically communicated or designed by senders who aim to influence the decisions of a receiver. In the first chapter, I study a cheap talk game between two imperfectly informed experts and a decision maker. The experts receive noisy signals about the state and sequentially communicate the relevant information to the decision maker. I refine the self-serving belief system under uncertainty and characterise the most informative equilibrium that might arise in such environments. In the second chapter, I consider the case where a decision maker seeks advice from a biased expert who also cares about establishing a reputation for being competent. The expert has an incentive to misreport her information, but she faces a trade-off between the gain from misrepresentation and the potential reputation loss. I show that the equilibrium is fully revealing if the expert is not too biased and not too highly reputable. If there is competition between two experts, information transmission is always improved. However, when there are more than two experts the result is ambiguous and depends on the players' prior belief over the states. In the last chapter, I consider a model of strategic communication where a privately and imperfectly informed sender can persuade a receiver. The sender may receive favorable or unfavorable private information about her preferred state. I describe two mechanisms that are adopted in real-life situations and that theoretically improve equilibrium informativeness given the sender's private information. First, a policy that imposes symmetry constraints on the choice of experiments. Second, an approval strategy characterised by a low precision threshold above which the receiver accepts the sender with positive probability, and a higher threshold above which the sender is accepted with certainty.
Abstract:
Massive Internet of Things (IoT) is expected to play a crucial role in Beyond 5G (B5G) wireless communication systems, offering seamless connectivity among heterogeneous devices without human intervention. However, given the exponential proliferation of smart devices and IoT networks, relying solely on terrestrial networks may not fully meet the demanding IoT requirements in terms of bandwidth and connectivity, especially in areas where terrestrial infrastructure is not economically viable. To unleash the full potential of 5G and B5G networks and enable seamless connectivity everywhere, the 3GPP envisions the integration of Non-Terrestrial Networks (NTNs) with terrestrial ones starting from Release 17. However, this integration requires modifications to the 5G standard to ensure reliable communications despite typical satellite channel impairments. In this framework, this thesis proposes techniques at the Physical and Medium Access Control layers that require minimal adaptations of the current NB-IoT standard for operation via NTN. First, the satellite impairments are evaluated and a detailed link budget analysis is provided. Then, analyses at the link and system levels are conducted. At the link level, a novel algorithm leveraging time-frequency analysis is proposed to detect orthogonal preambles and estimate the signals' arrival times. In addition, the effects of collisions on the detection probability and Bit Error Rate are investigated, and Non-Orthogonal Multiple Access approaches are proposed for the random access and data phases. The system-level analysis evaluates the performance of random access under congestion. Various access parameters are tested in different satellite scenarios, and the performance is measured in terms of access probability and the time required to complete the procedure. Finally, a heuristic algorithm is proposed to jointly design the access and data phases, determining the number of satellite passages, the Random Access Periodicity, and the number of uplink repetitions that maximize the system's spectral efficiency.
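A minimal sketch of the kind of link budget computation mentioned above, assuming free-space path loss only and placeholder satellite/NB-IoT parameters (not the thesis's actual values):

    import math

    # Placeholder downlink parameters (illustrative only)
    eirp_dbw = 40.0          # satellite EIRP [dBW]
    freq_hz = 2.0e9          # carrier frequency [Hz] (S-band)
    slant_range_m = 600e3    # LEO slant range [m]
    g_over_t_dbk = -10.0     # terminal figure of merit G/T [dB/K]
    bandwidth_hz = 180e3     # NB-IoT carrier bandwidth [Hz]
    boltzmann_dbws = -228.6  # Boltzmann constant [dBW/K/Hz]

    # Free-space path loss: 20*log10(4*pi*d*f/c)
    fspl_db = 20 * math.log10(4 * math.pi * slant_range_m * freq_hz / 3.0e8)

    # Carrier-to-noise-density ratio and SNR over the carrier bandwidth
    cn0_dbhz = eirp_dbw - fspl_db + g_over_t_dbk - boltzmann_dbws
    snr_db = cn0_dbhz - 10 * math.log10(bandwidth_hz)

    print(f"FSPL = {fspl_db:.1f} dB, C/N0 = {cn0_dbhz:.1f} dBHz, SNR = {snr_db:.1f} dB")

In practice such a budget would also include atmospheric, polarization, and scintillation losses, as well as the large Doppler shifts and delays typical of LEO links that motivate the adaptations studied in the thesis.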
Abstract:
Fiber reinforced polymer composites have been widely applied in the aeronautical field. However, composite processing using open (unlocked) molds should be avoided in view of the tight requirements and the possibility of environmental contamination. To produce high-performance structural frames meeting aeronautical reproducibility and low-cost criteria, the Brazilian industry has shown interest in investigating the resin transfer molding (RTM) process, a closed-mold pressure injection system that allows faster gel and cure times. Because of the anisotropy and inhomogeneity of fibrous composites, their fatigue behavior is a complex phenomenon, quite different from that of metallic materials, and crucial to investigate for aeronautical applications. Sub-scale fatigue specimens of a composite made of intermediate-modulus carbon fiber non-crimp multi-axial reinforcement and a mono-component epoxy system were produced according to ASTM D 3039. Axial fatigue tests were carried out according to ASTM D 3479, with a sinusoidal load at a frequency of 10 Hz and a load ratio R = 0.1. A high fatigue life interval was observed for the NCF/RTM6 composites. Weibull statistical analysis was applied to describe the failure probability of the materials under cyclic loads, and the fracture patterns were observed by scanning electron microscopy.
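As a hedged sketch of the Weibull analysis mentioned above, the snippet below fits a two-parameter Weibull distribution to invented cycles-to-failure data (placeholders, not the paper's measurements) and evaluates a failure probability:

    import numpy as np
    from scipy.stats import weibull_min

    # Hypothetical cycles-to-failure for a set of fatigue specimens (illustrative only)
    cycles = np.array([1.2e5, 2.3e5, 3.1e5, 4.8e5, 5.5e5, 7.9e5, 9.4e5, 1.6e6])

    # Two-parameter Weibull fit (location fixed at zero)
    shape, loc, scale = weibull_min.fit(cycles, floc=0)

    # Failure probability at a given life N: F(N) = 1 - exp(-(N/scale)**shape)
    N = 5.0e5
    failure_prob = weibull_min.cdf(N, shape, loc=loc, scale=scale)
    print(f"shape={shape:.2f}, scale={scale:.3g}, "
          f"P(failure by {N:.0e} cycles)={failure_prob:.2f}")

The fitted shape parameter characterizes the scatter of the fatigue lives, while the scale parameter locates the characteristic life of the population.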
Abstract:
Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impact of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on coastal inundation mapping. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation of elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. One thousand error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated as the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
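The probabilistic mapping step described above reduces to counting, per grid cell, how often the perturbed DEM falls below the inundation level. A minimal sketch with a synthetic DEM and uncorrelated Gaussian errors (the study itself used regression-kriging, sequential Gaussian simulation, and a hydrologically correct bathtub rule):

    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic 100 x 100 coastal DEM (metres) -- a placeholder, not the study's data
    dem = rng.uniform(0.0, 3.0, size=(100, 100))
    inundation_level = 1.8  # illustrative water level: 1 m SLR plus a storm-surge component

    # Add 1,000 simulated error surfaces to the DEM and count how often each cell floods
    n_sim = 1000
    flooded = np.zeros_like(dem)
    for _ in range(n_sim):
        error = rng.normal(0.0, 0.15, size=dem.shape)  # placeholder: spatially uncorrelated
        flooded += (dem + error) <= inundation_level

    # Per-cell inundation probability = proportion of simulations in which the cell was flooded
    prob_inundation = flooded / n_sim

    # Risk-averse mapping: cells flooded in at least 1% of simulations
    print(int((prob_inundation >= 0.01).sum()), "cells flagged at the 1% exceedance level")

Replacing the uncorrelated noise with spatially correlated realisations is what makes the connected-area (hydrologically correct) results differ from the simple deterministic map.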
Abstract:
Background: This paper is a commentary on a debate article entitled "Are we overpathologizing everyday life? A tenable blueprint for behavioral addiction research" by Billieux et al. (2015). Methods and aim: This brief response focuses on the need to better characterize the psychological and related neurocognitive determinants of persistent deleterious actions, whether or not they are associated with substance use. Results: A majority of addicted people may be driven by functional psychological reasons to keep using drugs, gambling, or buying despite the growing number of related negative consequences. In addition, a non-negligible proportion of them would need assistance to restore profound disturbances in the basic learning processes involved in compulsive actions. Conclusions: The distinction between the psychological functionality and the compulsive aspects of addictive behaviors should represent a significant step towards more efficient treatments.
Abstract:
Amphibians have been declining worldwide and the comprehension of the threats that they face could be improved by using mark-recapture models to estimate vital rates of natural populations. Recently, the consequences of marking amphibians have been under discussion and the effects of toe clipping on survival are debatable, although it is still the most common technique for individually identifying amphibians. The passive integrated transponder (PIT tag) is an alternative technique, but comparisons among marking techniques in free-ranging populations are still lacking. We compared these two marking techniques using mark-recapture models to estimate apparent survival and recapture probability of a neotropical population of the blacksmith tree frog, Hypsiboas faber. We tested the effects of marking technique and number of toe pads removed while controlling for sex. Survival was similar among groups, although it decreased slightly from individuals with one toe pad removed, to those with two and three toe pads removed, and finally to PIT-tagged individuals. No sex differences were detected. Recapture probability slightly increased with the number of toe pads removed and was the lowest for PIT-tagged individuals. Sex was an important predictor for recapture probability, with males being nearly five times more likely to be recaptured. Potential negative effects of both techniques may include reduced locomotion and high stress levels. We recommend the use of covariates in models to better understand the effects of marking techniques on frogs. The effect of the marking technique on the results should be accounted for, because most techniques may reduce survival. Based on our results, but also on the logistical and cost issues associated with PIT tagging, we suggest the use of toe clipping with anurans like the blacksmith tree frog.
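As a hedged sketch of the mark-recapture machinery referred to above (Cormack-Jolly-Seber-type models, in notation that may differ from the study's), the likelihood is built from per-occasion apparent survival \phi_t and recapture probability p_t. For example, an individual marked at occasion 1, missed at occasion 2 and recaptured at occasion 3 contributes

    \Pr(\text{history } 101) = \phi_1 \,(1 - p_2)\, \phi_2 \, p_3 ,

and maximising the product of such terms over all individuals, with \phi and p allowed to depend on covariates such as marking technique, number of toe pads removed, and sex, yields the estimates compared between toe-clipped and PIT-tagged groups.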
Abstract:
Aims: Surgical staple line dehiscence usually leads to severe complications. Several techniques and materials have been used to reinforce the staple line and thus reduce the related complications. The objective was to compare the safety of two types of anastomotic reinforcement in open gastric bypass. Methods: A prospective, randomized study compared an extraluminal suture, fibrin glue, and a nonpermanent buttressing material, Seamguard(R), for staple line reinforcement. Fibrin glue was excluded from the study and analysis after two leaks that required surgical reintervention, antibiotic therapy, and prolonged patient hospitalization. Results: Twenty patients were assigned to the suture and Seamguard reinforcement groups. The groups were similar in terms of preoperative characteristics. No staple line dehiscence occurred in these two groups, whereas two cases of dehiscence occurred in the fibrin glue group. There was no mortality, and surgical time was statistically similar for both techniques. Seamguard made the surgery more expensive. Conclusion: In our service, staple line reinforcement in open bariatric surgery with oversewing or Seamguard was considered safe. Seamguard application was considered easier than oversewing, but more expensive.
Abstract:
The structure of probability currents is studied for dynamical networks obtained after consecutive contractions of two-state, nonequilibrium lattice systems. This procedure allows us to investigate the transition rates between configurations on small clusters and highlights some relevant effects of lattice symmetries on the elementary transitions responsible for entropy production. A method is suggested to estimate the entropy production at different levels of approximation (cluster sizes), as demonstrated for the two-dimensional contact process with mutation.
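For reference, and with notation that may differ from the paper's, the entropy production rate estimated from such probability currents is commonly written in the Schnakenberg form

    \dot{S} = \frac{1}{2} \sum_{\alpha,\beta} \left[ P(\alpha)\, w_{\alpha \to \beta} - P(\beta)\, w_{\beta \to \alpha} \right] \ln \frac{P(\alpha)\, w_{\alpha \to \beta}}{P(\beta)\, w_{\beta \to \alpha}} ,

where P(\alpha) is the stationary probability of a (cluster) configuration and w_{\alpha \to \beta} the transition rate between configurations; the bracketed differences are precisely the probability currents whose structure is analysed after contraction, and evaluating the sum on clusters of increasing size gives the successive levels of approximation.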
Abstract:
Several studies using vegetable fibers as the exclusive reinforcement in fiber-cement composites have shown acceptable mechanical performance at early ages. However, after exposure to accelerated aging tests, these composites have shown a significant reduction in toughness or an increase in embrittlement. This has been attributed mainly to the improved fiber-matrix adhesion and fiber mineralization after the aging process. The objective of the present research was to evaluate composites produced by the slurry dewatering technique followed by pressing and air curing, reinforced with combinations of polypropylene fibers and sisal kraft pulp at different pulp freeness levels. The physical properties, mechanical performance, and microstructural characteristics of the composites were evaluated before and after accelerated and natural aging. The results showed the great contribution of pulp refinement to the improvement of the mechanical strength of the composites. Higher intensities of refinement resulted in a higher modulus of rupture for the composites with hybrid reinforcement after accelerated and natural aging. The more compact microstructure was due to the improved packing of the mineral particles with refined sisal pulp. The toughness of the composites after aging was maintained relative to the composites at 28 days of cure.
Abstract:
In a sample of censored survival times, the presence of an immune proportion of individuals who are not subject to death, failure, or relapse may be indicated by a relatively high number of individuals with large censored survival times. In this paper the generalized log-gamma model is modified to allow for the possibility that long-term survivors are present in the data. The model attempts to estimate separately the effects of covariates on the surviving fraction, that is, the proportion of the population for which the event never occurs. The logistic function is used for the regression model of the surviving fraction. Inference for the model parameters is carried out via maximum likelihood. Some influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed, and discussed. Finally, a data set from the medical area is analyzed using the generalized log-gamma mixture model, and a residual analysis is performed in order to select an appropriate model.
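In the notation commonly used for such long-term survivor (mixture) models, and hedging that the paper's exact parameterisation may differ, the population survival function can be sketched as

    S_{\mathrm{pop}}(t \mid \mathbf{x}) = p(\mathbf{x}) + \bigl(1 - p(\mathbf{x})\bigr)\, S(t), \qquad p(\mathbf{x}) = \frac{\exp(\mathbf{x}^{\top}\boldsymbol{\beta})}{1 + \exp(\mathbf{x}^{\top}\boldsymbol{\beta})} ,

where p(\mathbf{x}) is the surviving (immune) fraction linked to the covariates through the logistic function and S(t) is the generalized log-gamma survival function of the susceptible individuals; all parameters are estimated jointly by maximum likelihood.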
Abstract:
Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained by factors and/or covariates. When such factors operate, the usual normal regression models, which inherently assume constant variance, under-represent the variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models then underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The joint posterior density function was sampled using Markov chain Monte Carlo algorithms, allowing inference on the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when only limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
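A hedged sketch of the kind of hierarchical specification described above, with symbols (\boldsymbol{\beta}, \boldsymbol{\gamma}, b_i) introduced here for illustration rather than taken from the paper: for observed proportions y_i = s_i / n_i,

    s_i \mid p_i \sim \mathrm{Binomial}(n_i, p_i), \qquad \mathrm{logit}(p_i) = \mathbf{x}_i^{\top}\boldsymbol{\beta} + b_i, \qquad b_i \sim \mathrm{N}(0, \sigma_i^2), \qquad \log \sigma_i^2 = \mathbf{z}_i^{\top}\boldsymbol{\gamma} ,

where the random effect b_i absorbs the overdispersion and its variance is regressed on the noise factors \mathbf{z}_i; with priors placed on (\boldsymbol{\beta}, \boldsymbol{\gamma}), the joint posterior is sampled by MCMC.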
Abstract:
Two experiments were conducted on the nature of expert perception in the sport of squash. In the first experiment, ten expert and fifteen novice players attempted to predict the direction and force of squash strokes from either a film display (occluded at variable time periods before and after the opposing player had struck the ball) or a matched point-light display (containing only the basic kinematic features of the opponent's movement pattern). Experts outperformed the novices under both display conditions, and the same basic time windows that characterised expert and novice pick-up of information in the film task also persisted in the point-light task. This suggests that the experts' perceptual advantage is directly related to their superior pick-up of essential kinematic information. In the second experiment, the vision of six expert and six less skilled players was occluded by remotely triggered liquid-crystal spectacles at quasi-random intervals during simulated match play. Players were required to complete their current stroke even when the display was occluded and their prediction performance was assessed with respect to whether they moved to the correct half of the court to match the direction and depth of the opponent's stroke. Consistent with experiment 1, experts were found to be superior in their advance pick-up of both directional and depth information when the display was occluded during the opponent's hitting action. However, experts also remained better than chance, and clearly superior to less skilled players, in their prediction performance under conditions where occlusion occurred before any significant pre-contact preparatory movement by the opposing player was visible. This additional source of expert superiority is attributable to their superior attunement to the information contained in the situational probabilities and sequential dependences within their opponent's pattern of play.
Abstract:
Phytophthora root rot (Phytophthora medicaginis) and colletotrichum crown rot (Colletotrichum trifolii) are the 2 most serious pathogens of lucerne in eastern Australia. Work reported in this paper shows that, in glasshouse tests of the 11 most commonly grown Australian lucerne cultivars, the proportion of individual plants with resistance to both pathogens ranged from 0 (Hunter River and Aurora) to a maximum of 19.8% (Sequel HR). Within 9 of the cultivars, the proportion of individual plants resistant to the 2 pathogens was <7%. Since these 2 diseases are known to cause serious losses in eastern Australia, the results indicate that further improvement in lucerne production can be obtained by increasing the proportion of individual plants in a cultivar resistant to both pathogens. This would be best achieved by identifying dominant sources of resistance and incorporating them into ongoing lucerne breeding programs.
Abstract:
Background and aim of the study: Results of valve re-replacement (reoperation) in 898 patients undergoing aortic valve replacement with cryopreserved homograft valves between 1975 and 1998 are reported. The study aim was to provide estimates of the unconditional probability of valve reoperation and the cumulative incidence function (actual risk) of reoperation. Methods: Valves were implanted by subcoronary insertion (n = 500), inclusion cylinder (n = 46), and aortic root replacement (n = 352). The probability of reoperation was estimated by adopting a mixture model framework within which estimates were adjusted for two risk factors: patient age at initial replacement and implantation technique. Results: For a patient aged 50 years, the probability of reoperation in his/her lifetime was estimated as 44% and 56% for the non-root and root replacement techniques, respectively. For a patient aged 70 years, the estimated probability of reoperation was 16% and 25%, respectively. Given that a reoperation is required, patients with non-root replacement have a higher hazard rate than those with root replacement (hazard ratio = 1.4), indicating that non-root replacement patients tend to undergo reoperation sooner before death than root replacement patients. Conclusion: Younger patient age and root versus non-root replacement are risk factors for reoperation. Valve durability is much lower in younger patients, while root replacement patients appear more likely to live longer and hence are more likely to require reoperation.
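In generic mixture-model notation (the paper's exact formulation may differ), the cumulative incidence of reoperation can be sketched as

    F_{\mathrm{reop}}(t \mid \mathbf{z}) = \pi(\mathbf{z})\, F_1(t \mid \mathbf{z}) ,

where \pi(\mathbf{z}) is the lifetime (unconditional) probability that a patient with covariates \mathbf{z} (age at initial replacement, implantation technique) ever undergoes reoperation, and F_1 is the conditional distribution of time to reoperation among those who do; the quoted 44%/56% and 16%/25% figures are estimates of \pi at ages 50 and 70 years for the two techniques.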