16 results for size-selection
at Universidade do Minho
Abstract:
Novel input modalities such as touch, tangibles or gestures try to exploit humans' innate skills rather than imposing new learning processes. However, despite the recent boom of natural interaction paradigms, it has not been systematically evaluated how these interfaces influence a user's performance, or whether each interface is more or less appropriate for: 1) different age groups; and 2) different basic operations, such as data selection, insertion or manipulation. This work presents the first step of an exploratory evaluation of whether users' performance is indeed influenced by the different interfaces. The key point is to understand how different interaction paradigms affect specific target audiences (children, adults and older adults) when dealing with a selection task. Sixty participants took part in this study to assess how different interfaces may influence the interaction of specific groups of users with regard to their age. Four input modalities were used to perform a selection task and the methodology was based on usability testing (speed, accuracy and user preference). The study suggests a statistically significant difference between mean selection times for each group of users, and also raises new issues regarding the “old” mouse input versus the “new” input modalities.
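The comparison of mean selection times across the three age groups is the kind of question a one-way ANOVA answers. The sketch below is a minimal stdlib implementation with hypothetical timing data: the group names follow the abstract, but the numbers are invented for illustration and are not the study's measurements.

```python
import statistics

# Hypothetical selection-time samples (seconds) for three age groups.
# These numbers are illustrative only, not the study's data.
groups = {
    "children":     [1.9, 2.1, 2.4, 2.0, 2.3],
    "adults":       [1.2, 1.1, 1.4, 1.3, 1.2],
    "older_adults": [2.8, 3.1, 2.9, 3.3, 3.0],
}

def one_way_anova_f(samples):
    """F statistic of a one-way ANOVA over a list of samples."""
    k = len(samples)                            # number of groups
    n = sum(len(s) for s in samples)            # total observations
    grand = sum(sum(s) for s in samples) / n    # grand mean
    ss_between = sum(len(s) * (statistics.mean(s) - grand) ** 2
                     for s in samples)
    ss_within = sum(sum((x - statistics.mean(s)) ** 2 for x in s)
                    for s in samples)
    # F = between-group mean square / within-group mean square
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f_stat = one_way_anova_f(list(groups.values()))  # large F -> means differ
```

A large F statistic (here far above the critical value for 2 and 12 degrees of freedom) indicates that at least one group mean differs, matching the kind of significant difference the abstract reports.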
Abstract:
Doctoral Thesis - Leaders for Technical Industries (LTI) - MIT Portugal
Abstract:
Tri-layered and bi-layered magnetoelectric (ME) flexible composite structures of varying geometries and sizes, consisting of magnetostrictive Vitrovac and piezoelectric poly(vinylidene fluoride) (PVDF) layers, were fabricated by direct bonding. The ME measurements showed that tri-layered composite structures (magnetostrictive-piezoelectric-magnetostrictive type) exhibit a higher ME response (75 V.cm-1.Oe-1) than the bi-layered structure (66 V.cm-1.Oe-1). The ME voltage coefficient decreased with increasing longitudinal size aspect ratio between the PVDF and Vitrovac layers (from 1.1 to 4.3), with a maximum ME voltage coefficient of 66 V.cm-1.Oe-1 being observed. It was also observed that the composite with the lowest transversal aspect ratio between the PVDF and Vitrovac layers showed better ME performance than the structures with higher transversal size aspect ratios. An intimate relation was further determined between the Vitrovac/PVDF area ratio and the ME response of the composites: when this ratio approaches 1, the ME response is largest. Additionally, the ME output value and magnetic field response were controlled by changing the number of Vitrovac layers, which allows the development of magnetic sensors and energy harvesting devices.
Abstract:
Nowadays, the sustainability of buildings is extremely important. This concept is in line with the European aims of the Horizon 2020 Programme, which concern the reduction of environmental impacts through aspects such as energy efficiency and renewable technologies, among others. Sustainability is an extremely broad concept but, in this work, the focus is on sustainability in buildings. Within this concept, which aims at integrating the environmental, social and economic levels towards the preservation of the planet and the integrity of users, there are currently several environmental certification tools applicable to the construction industry (LEED, BREEAM, DGNB, SBTool, among others). In this context, the SBTool (Sustainable Building Tool) stands out: it is employed in several countries and can be applied in institutions of basic education, which are the base for the formation of critical masses and for the development of a country. The main aim of this research is to select indicators that can be used in a methodology for the sustainability assessment (SBTool) of school buildings in Portugal and in Brazil. To achieve this, other methodologies that already incorporate parameters directly related to the school environment, such as BREEAM or LEED, will also be analyzed.
Abstract:
This Letter reports a measurement of the exclusive γγ→ℓ+ℓ−(ℓ=e,μ) cross-section in proton-proton collisions at a centre-of-mass energy of 7 TeV by the ATLAS experiment at the LHC, based on an integrated luminosity of 4.6 fb−1. For the electron or muon pairs satisfying exclusive selection criteria, a fit to the dilepton acoplanarity distribution is used to extract the fiducial cross-sections. The cross-section in the electron channel is determined to be σexcl.γγ→e+e−=0.428±0.035(stat.)±0.018(syst.) pb for a phase-space region with invariant mass of the electron pairs greater than 24 GeV, in which both electrons have transverse momentum pT>12 GeV and pseudorapidity |η|<2.4. For muon pairs with invariant mass greater than 20 GeV, muon transverse momentum pT>10 GeV and pseudorapidity |η|<2.4, the cross-section is determined to be σexcl.γγ→μ+μ−=0.628±0.032(stat.)±0.021(syst.) pb. When proton absorptive effects due to the finite size of the proton are taken into account in the theory calculation, the measured cross-sections are found to be consistent with the theory prediction.
Abstract:
The immune system can recognize virtually any antigen, yet T cell responses against several pathogens, including Mycobacterium tuberculosis, are restricted to a limited number of immunodominant epitopes. The host factors that affect immunodominance are incompletely understood. Whether immunodominant epitopes elicit protective CD8+ T cell responses or instead act as decoys to subvert immunity and allow pathogens to establish chronic infection is unknown. Here we show that anatomically distinct human granulomas contain clonally expanded CD8+ T cells with overlapping T cell receptor (TCR) repertoires. Similarly, the murine CD8+ T cell response against M. tuberculosis is dominated by TB10.4₄₋₁₁-specific T cells with extreme TCRβ bias. Using a retrogenic model of TB10.4₄₋₁₁-specific CD8+ T cells, we show that TCR dominance can arise from competition between clonotypes driven by differences in affinity. Finally, we demonstrate that TB10.4-specific CD8+ T cells mediate protection against tuberculosis, which requires interferon-γ production and TAP1-dependent antigen presentation in vivo. Our study of how immunodominance, biased TCR repertoires, and protection are interrelated provides a new way to measure the quality of T cell immunity which, if applied to vaccine evaluation, could enhance our understanding of how to elicit protective T cell immunity.
Abstract:
The currently available clinical imaging methods do not provide highly detailed information about the location and severity of axonal injury or the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality, based on diffusion technology [2], that allows directly visualizing and quantifying the degree of axonal damage, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. The lack of a phantom able to properly mimic the human brain hinders the testing, calibration and validation of these medical imaging techniques. Most research done in this area fails on key points, such as the reproduced size limit of the brain fibers and the quick and easy reproducibility of phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow producing controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account in materials selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is part of a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between the axons and the thousands of hollow multifilaments in a fibrous arrangement such as a yarn, low-twist polypropylene multifilament yarns were selected for this development.
In this sense, extruded hollow filaments were analysed by scanning electron microscopy to characterize their main dimensions and shape. In order to approximate the dimensional scale to human axons, five types of polypropylene yarns with different linear density (denier) were used, aiming to understand the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the cross-section of the polypropylene filaments was reduced in a drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density causes an increase in the size of the filament cross-section. With the increase of filament structure orientation induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps to create a brain phantom that properly mimics the human brain and may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
A high-resolution mtDNA phylogenetic tree allowed us to look backward in time to investigate purifying selection. Purifying selection was very strong in the last 2,500 years, continuously eliminating pathogenic mutations back until the end of the Younger Dryas (∼11,000 years ago), when a large population expansion likely relaxed selection pressure. This was preceded by a phase of stable selection until another relaxation occurred in the out-of-Africa migration. Demography and selection are closely related: expansions led to relaxation of selection, and higher-pathogenicity mutations significantly decreased the growth of descendants. The only detectable positive selection was the recurrence of highly pathogenic nonsynonymous mutations (m.3394T>C-m.3397A>G-m.3398T>C) at interior branches of the tree, preventing the formation of a dinucleotide STR (TATATA) in the MT-ND1 gene. At the most recent time scale, in 124 mother-children transmissions, purifying selection was detectable through the loss of mtDNA variants with high predicted pathogenicity. A few haplogroup-defining sites were also heteroplasmic, agreeing with a significant propensity of 349 positions in the phylogenetic tree to revert to the ancestral variant. This nonrandom mutation property explains the observation of heteroplasmic mutations at some haplogroup-defining sites in sequencing datasets, which may not indicate poor quality as has been claimed.
Abstract:
Distributed data aggregation is an important task, allowing the decentralized determination of meaningful global properties that can then be used to direct the execution of other applications. The resulting values come from the distributed computation of functions like count, sum and average. Application examples can be found in determining the network size, total storage capacity, average load, majorities and many others. In the last decade, many different approaches have been proposed, with different trade-offs in terms of accuracy, reliability, message and time complexity. Due to the considerable amount and variety of aggregation algorithms, it can be difficult and time consuming to determine which techniques will be more appropriate to use in specific settings, justifying the existence of a survey to aid in this task. This work reviews the state of the art on distributed data aggregation algorithms, providing three main contributions. First, it formally defines the concept of aggregation, characterizing the different types of aggregation functions. Second, it succinctly describes the main aggregation techniques, organizing them in a taxonomy. Finally, it provides some guidelines toward the selection and use of the most relevant techniques, summarizing their principal characteristics.
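As an illustration of the family of techniques such a survey covers, the sketch below simulates push-sum gossip, a classic decentralized algorithm for the average function: each node holds a (sum, weight) pair, each interaction halves one node's pair and pushes half to a random peer, and every node's sum/weight ratio converges to the global average. This is an illustrative simulation under assumed parameters, not a specific algorithm from the survey.

```python
import random

# Push-sum gossip sketch for decentralized averaging. Each node holds a
# (sum, weight) pair; per round, a random node keeps half its pair and
# pushes the other half to a random peer. The total "mass" is conserved,
# so every node's sum/weight ratio converges to the global average.
def push_sum_average(values, rounds=2000, seed=7):
    rng = random.Random(seed)
    n = len(values)
    s = list(map(float, values))   # running sums
    w = [1.0] * n                  # running weights (total stays n)
    for _ in range(rounds):
        i, j = rng.randrange(n), rng.randrange(n)
        s[i] *= 0.5                # node i keeps one half...
        w[i] *= 0.5
        s[j] += s[i]               # ...and pushes the other half to j
        w[j] += w[i]
    return [s[k] / w[k] for k in range(n)]

estimates = push_sum_average([1, 2, 3, 4, 5, 6, 7, 8])  # true average: 4.5
```

After enough interactions, every node's local estimate agrees with the true average without any central coordinator, which is the essence of gossip-based aggregation.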
Abstract:
The aim of this paper is to predict time series of SO2 concentrations emitted by coal-fired power stations in order to estimate emission episodes in advance and analyze the influence of some meteorological variables in the prediction. An emission episode is said to occur when the series of bi-hourly means of SO2 is greater than a specific level. For coal-fired power stations it is essential to predict emission episodes sufficiently in advance so appropriate preventive measures can be taken. We proposed a methodology to predict SO2 emission episodes based on using an additive model and an algorithm for variable selection. The methodology was applied to the estimation of SO2 emissions registered in sampling locations near a coal-fired power station located in Northern Spain. The results obtained indicate a good performance of the model considering only two terms of the time series, and that the inclusion of the meteorological variables in the model is not significant.
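The episode definition above can be sketched directly: flag every 2-hour window whose mean SO2 concentration exceeds a limit value. The hourly readings and the limit below are invented for illustration, not the paper's data.

```python
# Flag emission episodes: 2-hour windows where the bi-hourly mean of SO2
# exceeds a limit. Readings and the limit are illustrative values only.
def bihourly_means(hourly_so2):
    """Mean of each consecutive pair of hourly readings."""
    return [(a + b) / 2 for a, b in zip(hourly_so2[::2], hourly_so2[1::2])]

def episodes(hourly_so2, limit):
    """Indices of the 2-hour windows whose mean exceeds the limit."""
    return [i for i, m in enumerate(bihourly_means(hourly_so2)) if m > limit]

hourly_so2 = [10, 12, 30, 80, 90, 40, 15, 12]   # invented hourly readings
flagged = episodes(hourly_so2, limit=50)        # windows exceeding the limit
```

A predictive model such as the paper's additive model would forecast the upcoming readings, and this same thresholding rule would then be applied to the forecasts to anticipate episodes.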
Abstract:
The suitability of a total-length-based minimum capture size and different protection regimes was investigated for the gooseneck barnacle Pollicipes pollicipes shellfishery in N Spain. For this analysis, individuals that were collected from 10 sites under different fishery protection regimes (permanently open, seasonally closed, and permanently closed) were used. First, we applied a non-parametric regression model to explore the relationship between the capitulum Rostro-Tergum (RT) size and the Total Length (TL). Important heteroskedastic disturbances were detected for this relationship, demonstrating a high variability of TL with respect to RT. This result substantiates the unsuitability of a TL-based minimum size by means of a mathematical model. Due to these disturbances, an alternative growth-based minimum capture size of 26.3 mm RT (23 mm RC) was estimated using the first derivative of a Kernel-based non-parametric regression model for the relationship between RT and dry weight. For this purpose, data from the permanently protected area were used to avoid bias due to the fishery. Second, the size-frequency distribution similarity was computed using a MDS analysis for the studied sites to evaluate the effectiveness of the protection regimes. The results of this analysis indicated a positive effect of the permanent protection, while the effect of the seasonal closure was not detected. This result needs to be interpreted with caution because the current harvesting based on a potentially unsuitable minimum capture size may dampen the efficacy of the seasonal protection regime.
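The kernel-based non-parametric regression used for the RT/dry-weight relationship can be sketched with a Nadaraya-Watson estimator plus a finite-difference first derivative of the fitted curve. The data below are synthetic (a simple quadratic growth curve), not the shellfishery measurements, and the bandwidth is an arbitrary choice for illustration.

```python
import math

# Nadaraya-Watson kernel regression (Gaussian kernel) and a central
# finite-difference first derivative of the fitted curve -- the kind of
# estimator used for the RT-size vs dry-weight relationship.
def nw_estimate(x0, xs, ys, h):
    """Kernel-weighted local mean of ys at x0 with bandwidth h."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def nw_derivative(x0, xs, ys, h, eps=1e-3):
    """Central finite-difference derivative of the fitted curve at x0."""
    return (nw_estimate(x0 + eps, xs, ys, h)
            - nw_estimate(x0 - eps, xs, ys, h)) / (2 * eps)

# Synthetic growth curve: dry weight ~ 0.01 * size^2 (invented relation).
xs = [i * 0.5 for i in range(2, 61)]            # sizes 1.0 .. 30.0 mm
ys = [0.01 * x ** 2 for x in xs]
est_20 = nw_estimate(20.0, xs, ys, h=1.0)       # fitted weight at 20 mm
deriv_20 = nw_derivative(20.0, xs, ys, h=1.0)   # fitted growth rate at 20 mm
```

In the paper's setting, the shape of this derivative (growth rate of dry weight with size) is what supports choosing a capture size, since it identifies where additional size still yields substantial weight gain.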
Abstract:
The selection of spawning habitat of a population of Octopus vulgaris that is subject to a small-scale exploitation was studied in the Cíes Islands within the National Park of the Atlantic Islands of Galicia (NW Spain). The technique used was visual censuses by scuba diving. We conducted 93 visual censuses from April 2012 to April 2014. The total swept area was 123.69 ha. Habitat features (season, depth, zone, bottom temperature, swept area, bottom substrate type, and creel fishing impact) were evaluated as predictors of the presence/absence of spawning dens using GAM models. O. vulgaris has a noteworthy preference for spawning in areas with hard bottom substrate and moderate depth (approximately 20 m). The highest density of spawning dens (1.08 ha−1) was found in a surveyed area of 50.14 ha located in the northeastern part of the northern Cíes Island. We propose to protect the area from Punta Escodelo to Punta Ferreiro between 5 and 30 m depth. This area has a surface of 158 ha, equivalent to 5.98% of the total marine area of the Cíes Islands. The strengths and weaknesses of a management strategy based on the protection of the species’ spawning habitat are discussed.
Abstract:
Master's dissertation in Communication, Art and Culture
Abstract:
Lipid nanoballoons integrating multiple emulsions of the water-in-oil-in-water type enclose, at least in theory, a biomimetic aqueous core suitable for housing hydrophilic biomolecules such as proteins, peptides and bacteriophage particles. The research effort reported in this paper is a full statistical 2³×3¹ factorial design study (three variables at two levels and one variable at three levels) to optimize biomimetic aqueous-core lipid nanoballoons for housing hydrophilic protein entities. The concentrations of protein, lipophilic and hydrophilic emulsifiers, and the homogenization speed were set as the four independent variables, whereas the mean particle hydrodynamic size (HS), zeta potential (ZP) and polydispersity index (PI) were set as the dependent variables. The 2³×3¹ factorial design constructed led to optimization over the higher (+1) and lower (-1) levels, with triplicate testing for the central (0) level, thus producing thirty-three experiments and leading to the selection of the optimized processing parameters as 0.015% (w/w) protein entity, 0.75% (w/w) lipophilic emulsifier (soybean lecithin) and 0.50% (w/w) hydrophilic emulsifier (poloxamer 188). In the present research effort, statistical optimization and production of protein derivatives encompassing full stabilization of their three-dimensional structure has been attempted by housing said molecular entities within biomimetic aqueous-core lipid nanoballoons integrating a multiple (W/O/W) emulsion.
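The factorial core of a 2³×3¹ design can be enumerated directly: three two-level factors crossed with one three-level factor give 24 combinations, with the paper's triplicated central-level runs bringing its total to thirty-three experiments. The factor labels below are shorthand for the abstract's four variables, and the coded levels (-1, 0, +1) are the usual design-of-experiments convention.

```python
from itertools import product

# Enumerate the factorial core of a 2^3 x 3^1 design: three factors at two
# coded levels (-1, +1) crossed with one factor at three levels (-1, 0, +1).
# Factor names are shorthand for the abstract's four independent variables.
two_level = (-1, +1)
three_level = (-1, 0, +1)

runs = [
    {"protein": p, "lipo_emulsifier": le, "hydro_emulsifier": he, "speed": s}
    for p, le, he in product(two_level, repeat=3)
    for s in three_level
]
n_runs = len(runs)   # 2 * 2 * 2 * 3 = 24 factorial combinations
```

Each dictionary is one experimental run in coded units; mapping -1/+1 back to the actual concentrations and speeds yields the concrete experiment list on which HS, ZP and PI are measured.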