755 results for hyperbolic tangent


Relevance:

10.00%

Publisher:

Abstract:

This work investigates the behavior of planar microstrip structures with fractal and helical elements. In particular, the conventional elements of frequency selective surfaces (FSSs) were replaced by fractal and helical geometries. The dielectric substrate used was fiberglass (FR-4), with a thickness of 1.5 mm, a relative permittivity of 4.4 and a loss tangent of 0.02. Dürer's fractal geometry and a helical geometry were adopted for the FSS elements. For the measurements, two horn antennas in direct line of sight were connected by coaxial cable to a vector network analyzer. Selected prototypes were built and measured. From the preliminary results, practical applications were sought for cascaded combinations of the structures. The FSSs with Dürer's fractal elements exhibited the multiband behavior provided by the fractal geometry, while the bandwidth narrowed as the fractal iteration level increased, making the surface more frequency selective, with a higher quality factor. A parametric study examined the effect of varying the air layer between cascaded structures. Cascaded fractal-element structures presented a tri-band behavior for certain air-layer thicknesses, with applications in the licensed 2.5 GHz band (2.3-2.7 GHz) and 3.5 GHz band (3.3-3.8 GHz). For the FSSs with helical elements, six structures were considered, namely H0, H1, H2, H3, H4 and H5. Their electromagnetic behavior was analyzed separately and in cascade. The preliminary results, for both the individual and the cascaded structures, showed that the bandwidth increases as the thickness of the air layer increases. Cascaded helical-element structures were found to have applications in the X band (8.0-12.0 GHz) and in the unlicensed band (5.25-5.85 GHz). The numerical and experimental characterization of the structures used, respectively, the commercial software Ansoft Designer and an Agilent N5230A vector network analyzer.
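As a reference point for the substrate parameters quoted above (FR-4, relative permittivity 4.4, loss tangent 0.02), the loss tangent of a dielectric is conventionally defined from the complex permittivity; the display below is the standard textbook definition, not a formula taken from this work.

```latex
% Standard definition of the dielectric loss tangent, assuming the usual
% convention \epsilon = \epsilon' - j\epsilon'' for the complex permittivity.
\[
  \tan\delta \;=\; \frac{\epsilon''}{\epsilon'} \;\approx\; 0.02
  \qquad\text{with}\qquad
  \epsilon_r \;=\; \frac{\epsilon'}{\epsilon_0} \;\approx\; 4.4
  \quad\text{(FR-4 values quoted above).}
\]
```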

Relevance:

10.00%

Publisher:

Abstract:

This work shows that combustion synthesis is a promising route for obtaining nanostructured, high-purity ceramic powders of complex oxides, such as the ferrites Co(1-x)Zn(x)Fe2O4 and Ni(1-x)Zn(x)Fe2O4 with x varying in steps of 0.2 mol over the range 0.2 ≤ x ≤ 1.0 mol. These materials present coexisting ferroelectric and ferrimagnetic states and can be used in microstrip antennas and low-frequency selective surfaces for miniaturized microwave devices without loss of performance. The powders were obtained by the combustion process, followed by the appropriate and ordered physical processing steps and by sintering of the substrate, yielding a high-purity ceramic material at the nanometric scale. Vibrating Sample Magnetometer (VSM) analysis showed that these ferrite materials display hysteresis behavior characteristic of good-quality magnetic materials, in which the magnetization state can be switched abruptly by a relatively small change in field intensity, giving them broad applications in electronics. X-ray Diffraction (XRD) analysis of the ceramic powders synthesized at 900 °C characterized their structural and geometrical properties, crystallite size and interplanar spacing. Further analyses included Scanning Electron Microscopy (SEM), X-ray Fluorescence (XRF), and measurements of electric permittivity and loss tangent at high frequencies using a ROHDE & SCHWARZ ZVB-14 Vector Network Analyzer (10 MHz-14 GHz).
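The abstract does not say how the crystallite size and interplanar spacing were extracted from the XRD patterns; the relations below are the standard ones normally used for this purpose (Bragg's law and the Scherrer equation) and are given here only as a hedged illustration.

```latex
% Bragg's law: interplanar spacing d from the diffraction angle \theta
% (\lambda is the X-ray wavelength, n the diffraction order).
\[
  n\lambda \;=\; 2d\sin\theta
  \qquad\Longrightarrow\qquad
  d \;=\; \frac{n\lambda}{2\sin\theta}
\]
% Scherrer equation: mean crystallite size D from the peak broadening \beta
% (full width at half maximum, in radians), with shape factor K \approx 0.9.
\[
  D \;=\; \frac{K\lambda}{\beta\cos\theta}
\]
```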

Relevance:

10.00%

Publisher:

Abstract:

Grothendieck's original Riemann-Roch theorem states that for every proper morphism f : Y → X between smooth, irreducible quasi-projective varieties over a field, and every element a ∈ K0(Y) of the Grothendieck group of vector bundles, one has ch(f!(a)) = f*(Td(Tf) · ch(a)) (cf. [BS58]). Here ch is the Chern character, Td(Tf) is the Todd class of the relative tangent bundle, and f* and f! are the direct images in the Chow ring and in K0 respectively. Later, Baum, Fulton and MacPherson proved in [BFM75] the Riemann-Roch theorem for local complete intersection morphisms between projective, singular algebraic schemes (that is, separated schemes locally of finite type over a field). In [FG83] Fulton and Gillet proved the theorem without projectivity hypotheses. The remarkable extension to higher K-theory for regular schemes over a base was proved by Gillet in [Gil81]. The Riemann-Roch theorem proved there is for projective morphisms between smooth quasi-projective schemes; hence, in the case of schemes over a field, Gillet's result does not recover the theorem of [BFM75]. The widest generalization of the Riemann-Roch theorem that I know of is that of [Dég14] and [HS15], where Déglise and Holmstrom-Scholbach independently obtain the Riemann-Roch theorem for higher K-theory and projective lci morphisms between regular schemes over a finite-dimensional noetherian base...
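For readability, the Grothendieck-Riemann-Roch identity quoted above, in display form (same notation as in the abstract):

```latex
% ch = Chern character, Td(T_f) = Todd class of the relative tangent bundle,
% f_* and f_! = direct images in the Chow ring and in K_0 respectively.
\[
  \mathrm{ch}\bigl(f_{!}(a)\bigr) \;=\; f_{*}\bigl(\mathrm{Td}(T_f)\cdot\mathrm{ch}(a)\bigr),
  \qquad a \in K_0(Y).
\]
```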

Relevance:

10.00%

Publisher:

Abstract:

Current state-of-the-art techniques for landmine detection in ground-penetrating radar (GPR) utilize statistical methods to identify characteristics of a landmine response. This research makes use of 2-D slices of data in which subsurface landmine responses have hyperbolic shapes. Various methods from the field of visual image processing are adapted to the 2-D GPR data, producing superior landmine detection results. This research goes on to develop a physics-based GPR augmentation method motivated by current advances in visual object detection. This GPR-specific augmentation is used to mitigate issues caused by insufficient training sets, and the work shows that augmentation improves detection performance under training conditions that are normally very difficult. Finally, this work introduces the use of convolutional neural networks as a method to learn feature-extraction parameters; these learned convolutional features outperform hand-designed features in GPR detection tasks. The methods presented, both borrowed from and motivated by the substantial body of work in visual image processing, improve overall detection performance and introduce a way to increase the robustness of statistical classification.
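The network architecture is not given in the abstract; the sketch below is only a minimal, hypothetical illustration, in PyTorch, of learning convolutional features from 2-D GPR patches. All layer sizes, the patch size and the class count are assumptions made for the example.

```python
# Minimal, hypothetical sketch of a CNN feature extractor for 2-D GPR slices.
# Layer sizes, patch dimensions and names are illustrative assumptions,
# not the architecture used in the work described above.
import torch
import torch.nn as nn

class GPRPatchClassifier(nn.Module):
    """Classifies a small 2-D GPR patch as landmine / background."""

    def __init__(self, in_channels: int = 1, n_classes: int = 2):
        super().__init__()
        # Learned convolutional features (replacing hand-designed ones).
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64),   # infers the flattened size at the first call
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    # A batch of 8 single-channel 64x64 GPR patches (synthetic placeholder data).
    patches = torch.randn(8, 1, 64, 64)
    model = GPRPatchClassifier()
    logits = model(patches)
    print(logits.shape)  # torch.Size([8, 2])
```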

Relevance:

10.00%

Publisher:

Abstract:

Trees and shrubs in tropical Africa use the C3 cycle as a carbon fixation pathway during photosynthesis, while grasses and sedges mostly use the C4 cycle. Leaf-wax lipids in sedimentary archives, such as the long-chain n-alkanes (e.g., n-C27 to n-C33), inherit carbon isotope ratios that are representative of the carbon fixation pathway. Therefore, n-alkane δ13C values are often used to reconstruct the past C3/C4 composition of vegetation, assuming that the relative proportions of C3 and C4 leaf waxes reflect the relative proportions of C3 and C4 plants. We compared the δ13C values of n-alkanes from modern C3 and C4 plants with previously published values from recent lake sediments and provide a framework for estimating the fractional (areal-based) contribution of C3 vegetation cover (fC3) represented by these sedimentary archives. Samples were collected in Cameroon, across a latitudinal transect that accommodates a wide range of climate zones and vegetation types, as reflected in the progressive northward replacement of C3-dominated rain forest by C4-dominated savanna. The C3 plants analysed were characterised by substantially higher abundances of n-C29 alkanes and substantially lower abundances of n-C33 alkanes than the C4 plants. Furthermore, the sedimentary δ13C values of n-C29 and n-C31 alkanes from recent lake sediments in Cameroon (-37.4 per mil to -26.5 per mil) were generally within the range of δ13C values for C3 plants, even for sites where C4 plants dominated the catchment vegetation. In such cases, simple linear mixing models fail to accurately reconstruct the relative proportions of C3 and C4 vegetation cover from the δ13C values of sedimentary n-alkanes, overestimating the proportion of C3 vegetation, likely as a consequence of differences in plant-wax production, preservation, transport, and/or deposition between C3 and C4 plants. We therefore tested a set of non-linear binary mixing models using δ13C values from both C3 and C4 vegetation as end-members. The non-linear models included a sigmoid (sine-squared) function that describes small variations in fC3 as the minimum and maximum δ13C values are approached, and a hyperbolic function that takes into account the differences between C3 and C4 plants discussed above. Model fitting and uncertainty estimation were carried out with a Monte Carlo algorithm and can be improved as additional data become available. The models that best fitted the observed δ13C values of sedimentary n-alkanes were either hyperbolic functions or a combination of hyperbolic and sine-squared functions. Such non-linear models may be used to convert δ13C measurements on sedimentary n-alkanes directly into reconstructions of C3 vegetation cover.
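The exact functional forms used in the study are not given in the abstract. As an illustration of the difference between the two classes of model, the standard linear two-end-member mixing model, and one plausible hyperbolic variant in which C3 cover contributes ε times as much n-alkane as C4 cover (ε being a hypothetical, fitted factor), can be written as follows.

```latex
% Simple linear two-end-member mixing (the model criticised above):
\[
  f_{C3} \;=\;
  \frac{\delta^{13}\mathrm{C}_{C4} - \delta^{13}\mathrm{C}_{sed}}
       {\delta^{13}\mathrm{C}_{C4} - \delta^{13}\mathrm{C}_{C3}}
\]
% Illustrative hyperbolic variant: weighting C3 cover by a relative
% wax-contribution factor \varepsilon makes \delta^{13}C_{sed} a hyperbolic
% (rational) function of the areal fraction f_{C3}:
\[
  \delta^{13}\mathrm{C}_{sed} \;=\;
  \frac{\varepsilon\, f_{C3}\,\delta^{13}\mathrm{C}_{C3}
        + (1 - f_{C3})\,\delta^{13}\mathrm{C}_{C4}}
       {\varepsilon\, f_{C3} + (1 - f_{C3})}
\]
```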

Relevance:

10.00%

Publisher:

Abstract:

This article investigates the influence of advertising as a shaping element of the urban essence in the work of the draughtsman and painter Nicolás Gless, a body of work in which advertising and graphic design go beyond the role of mere ornament to become a hyperbolic trait of postmodern societies. These decisive references, combined with a deep knowledge of the artistic movements of the twentieth century and a visual language close to that of comics, crystallise in a series of urban landscapes in a highly recognisable style, known as "paisajes electrográficos" (electrographic landscapes). The work is a paradigmatic case of the reciprocal influences between art and advertising that have been in force since the mid-nineteenth century.

Relevance:

10.00%

Publisher:

Abstract:

Understanding the exploration patterns of foragers in the wild provides fundamental insight into animal behavior. Recent experimental evidence has demonstrated that path lengths (distances between consecutive turns) taken by foragers are well fitted by a power law distribution. Numerous theoretical contributions have posited that “Lévy random walks”—which can produce power law path length distributions—are optimal for memoryless agents searching a sparse reward landscape. It is unclear, however, whether such a strategy is efficient for cognitively complex agents, from wild animals to humans. Here, we developed a model to explain the emergence of apparent power law path length distributions in animals that can learn about their environments. In our model, the agent’s goal during search is to build an internal model of the distribution of rewards in space that takes into account the cost of time to reach distant locations (i.e., temporally discounting rewards). For an agent with such a goal, we find that an optimal model of exploration in fact produces hyperbolic path lengths, which are well approximated by power laws. We then provide support for our model by showing that humans in a laboratory spatial exploration task search space systematically and modify their search patterns under a cost of time. In addition, we find that path length distributions in a large dataset obtained from free-ranging marine vertebrates are well described by our hyperbolic model. Thus, we provide a general theoretical framework for understanding spatial exploration patterns of cognitively complex foragers.
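The abstract does not display the functional forms involved; for orientation only, the textbook hyperbolic discounting function and a generic hyperbolic (Lomax-type) path-length distribution, whose tail is well approximated by a power law, look like this.

```latex
% Textbook hyperbolic temporal discounting of a reward R obtained after delay t
% (k is a discount-rate parameter; not necessarily the exact form used above):
\[
  V(t) \;=\; \frac{R}{1 + k\,t}
\]
% Generic hyperbolic (Lomax-type) path-length distribution with scale \ell_0 and
% exponent \alpha; for \ell \gg \ell_0 it behaves like a pure power law:
\[
  p(\ell) \;\propto\; \Bigl(1 + \frac{\ell}{\ell_0}\Bigr)^{-\alpha}
  \;\sim\; \ell^{-\alpha} \quad (\ell \gg \ell_0).
\]
```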

Relevance:

10.00%

Publisher:

Abstract:

This study describes the design and characterisation of the rheological and mechanical properties of binary polymeric systems composed of 2-hydroxypropylcellulose and ι-carrageenan, designed as ophthalmic viscoelastic devices (OVDs). Platforms were characterised using dilute solution, flow and oscillatory rheometry and texture profile analysis. Rheological synergy between the two polymers was observed both in the dilute and gel states. All platforms exhibited pseudoplastic flow. Increasing polymer concentrations significantly decreased the loss tangent and rate index yet increased the storage and loss moduli, consistency, gel hardness, compressibility and adhesiveness, the latter being related to the in-vivo retention properties of the platforms. Binary polymeric platforms exhibited unique physicochemical properties, properties that could not be engineered using mono-polymeric platforms. Using characterisation methods that provide information relevant to their clinical performance, low-cost binary platforms (3% hydroxypropylcellulose and either 1% or 2% ι-carrageenan) were identified that exhibited rheological, textural and viscoelastic properties advantageous for use as OVDs.
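For reference, the oscillatory-rheometry quantities mentioned above are related in the conventional way; the display below is the standard definition, not a result of the study.

```latex
% G'  = storage (elastic) modulus, G'' = loss (viscous) modulus.
\[
  \tan\delta \;=\; \frac{G''}{G'}
\]
% A decrease in the loss tangent with increasing polymer concentration therefore
% indicates increasingly elastic, gel-like behaviour of the binary platforms.
```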

Relevance:

10.00%

Publisher:

Abstract:

Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualisation of large-scale (industry-size) models, which in many projects can reach the order of thousands, along with the dependency relationships that exist among them. These have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped identify the major limitations of existing tools. Based on the findings, a novel approach to managing variability was created, employing two main principles to support scalability. First, the separation-of-concerns principle was applied by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualise the models (in place of the Euclidean-space trees traditionally used). The result was an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards; the framework was then used with several case studies to benchmark the performance of this work against other existing tools.
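The visualisation approach itself is not detailed in the abstract; the sketch below is only a minimal, hypothetical illustration, in Python, of the idea behind a hyperbolic-tree layout: a variability-model tree is placed in the Poincaré disk so that deep subtrees are compressed towards the rim rather than overflowing the screen. All class names, node names and parameters are invented for the example.

```python
# Minimal, hypothetical sketch of laying a variability-model tree out in the
# Poincare disk, in the spirit of a hyperbolic-tree visualisation.  It is not
# a reproduction of the tool described above.
import math
from dataclasses import dataclass, field

@dataclass
class VariabilityPoint:
    name: str
    children: list["VariabilityPoint"] = field(default_factory=list)

def layout(node, depth=0, angle_lo=0.0, angle_hi=2 * math.pi, scale=1.0, out=None):
    """Assign each node a position in the unit (Poincare) disk.

    A node at tree depth d is placed at hyperbolic distance d*scale from the
    centre; the Euclidean radius of such a point is tanh(d*scale / 2), which is
    what keeps large subtrees near the rim.  Each child receives an equal
    angular sector of its parent's sector.
    """
    if out is None:
        out = {}
    angle = 0.5 * (angle_lo + angle_hi)
    radius = math.tanh(depth * scale / 2.0)  # hyperbolic -> Euclidean radius
    out[node.name] = (radius * math.cos(angle), radius * math.sin(angle))
    if node.children:
        step = (angle_hi - angle_lo) / len(node.children)
        for i, child in enumerate(node.children):
            layout(child, depth + 1, angle_lo + i * step,
                   angle_lo + (i + 1) * step, scale, out)
    return out

if __name__ == "__main__":
    root = VariabilityPoint("product-line", [
        VariabilityPoint("hardware", [VariabilityPoint("sensor"), VariabilityPoint("display")]),
        VariabilityPoint("software", [VariabilityPoint("ui"), VariabilityPoint("network")]),
    ])
    for name, (x, y) in layout(root).items():
        print(f"{name:12s} -> ({x:+.3f}, {y:+.3f})")
```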

Relevance:

10.00%

Publisher:

Abstract:

We consider a second-order variational problem depending on the covariant acceleration, which is related to the notion of Riemannian cubic polynomials. This problem and the corresponding optimal control problem are described in the context of higher order tangent bundles using geometric tools. The main tool, a presymplectic variant of Pontryagin’s maximum principle, allows us to study the dynamics of the control problem.
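The functional is not written out in the abstract; the second-order functional usually associated with Riemannian cubic polynomials has the standard form below, where D/dt denotes covariant differentiation along the curve.

```latex
% Second-order variational problem on a Riemannian manifold (standard form;
% boundary conditions on positions and velocities are understood):
\[
  J(\gamma) \;=\; \frac{1}{2}\int_{0}^{T}
  \Bigl\langle \frac{D^{2}\gamma}{dt^{2}},\, \frac{D^{2}\gamma}{dt^{2}} \Bigr\rangle \, dt,
\]
% so that D^{2}\gamma/dt^{2} is the covariant acceleration referred to above;
% its critical curves are the Riemannian cubic polynomials.
```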

Relevance:

10.00%

Publisher:

Abstract:

Forecasting is the basis for strategic, tactical and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. There are thus many methods to assist in the task of time-series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies of more advanced prediction methods. Among these, Artificial Neural Networks (ANNs) are a relatively new and promising method for business forecasting that has attracted much interest in the financial field and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to ARIMA-GARCH statistical models. In this context, this study examined whether ANNs are a more appropriate method for predicting the behavior of capital-market indices than traditional time-series analysis methods. For this purpose a quantitative study was developed, based on financial and economic indices, and two supervised-learning feedforward ANN models were built, each with 20 inputs, 90 neurons in a single hidden layer, and one output (the Ibovespa). These models were trained by backpropagation, with an activation function based on the tangent sigmoid (hyperbolic tangent) and a linear output function. Since the aim was to analyze the suitability of artificial neural networks for forecasting the Ibovespa, their results were compared with those of a GARCH time-series predictive model, for which a GARCH(1,1) model was developed. Once both methods (ANN and GARCH) had been applied, the results were analyzed by comparing the forecasts with the historical data and by studying the forecast errors using the MSE, RMSE, MAE, standard deviation, Theil's U statistic and forecast encompassing tests. The models developed with ANNs had lower MSE, RMSE and MAE than the GARCH(1,1) model, and the Theil's U test indicated that all three models have smaller errors than a naïve forecast. Although the ANN based on returns had lower values of the precision indicators than the ANN based on prices, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the two ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide more appropriate Ibovespa forecasts than traditional time-series models, represented here by the GARCH model.
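The abstract does specify the network topology (20 inputs, 90 hidden tangent-sigmoid neurons, one linear output, trained by backpropagation); the sketch below reproduces that topology in PyTorch, while the optimizer, learning rate, number of epochs and the placeholder data are purely illustrative assumptions.

```python
# Sketch of the feedforward network described above: 20 inputs, one hidden layer
# of 90 tanh units, a single linear output, trained by backpropagation.
# Optimizer, learning rate, epochs and the synthetic data are assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 90),   # 20 lagged/explanatory inputs -> 90 hidden neurons
    nn.Tanh(),           # hyperbolic-tangent ("tangent sigmoid") activation
    nn.Linear(90, 1),    # single linear output: next-step index value/return
)

loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Placeholder data: 256 windows of 20 past observations and their next value.
x = torch.randn(256, 20)
y = torch.randn(256, 1)

for epoch in range(200):          # plain backpropagation training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

with torch.no_grad():             # simple error metrics of the kind used above
    err = model(x) - y
    mse = torch.mean(err ** 2).item()
    rmse = mse ** 0.5
    mae = err.abs().mean().item()
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  MAE={mae:.4f}")
```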

Relevance:

10.00%

Publisher:

Abstract:

The evaluation of relativistic spin networks plays a fundamental role in the Barrett-Crane state sum model of Lorentzian quantum gravity in 4 dimensions. A relativistic spin network is a graph labelled by unitary irreducible representations of the Lorentz group appearing in the direct integral decomposition of the space of L^2 functions on three-dimensional hyperbolic space. To `evaluate' such a spin network we must do an integral; if this integral converges we say the spin network is `integrable'. Here we show that a large class of relativistic spin networks are integrable, including any whose underlying graph is the 4-simplex (the complete graph on 5 vertices). This proves a conjecture of Barrett and Crane, whose validity is required for the convergence of their state sum model.
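The integral itself is not reproduced in the abstract; schematically, and up to normalisation conventions that vary between papers, the evaluation of a relativistic spin network Γ with edge labels p_e is usually written as an integral over copies of hyperbolic space H^3, with one vertex held fixed.

```latex
% Schematic form of the evaluation (conventions vary): x_v ranges over H^3 for
% every vertex v except a fixed vertex v_0, and each edge e with endpoints
% s(e), t(e) contributes a kernel depending on its label p_e.
\[
  \mathcal{E}(\Gamma) \;=\;
  \int_{(H^{3})^{\,V-1}} \;\prod_{e} K_{p_e}\!\bigl(x_{s(e)}, x_{t(e)}\bigr)
  \prod_{v \neq v_0} dx_v,
  \qquad
  K_{p}(x,y) \;=\; \frac{\sin\!\bigl(p\,d(x,y)\bigr)}{p\,\sinh\!\bigl(d(x,y)\bigr)},
\]
% where d(x,y) is the hyperbolic distance; the spin network is `integrable'
% precisely when this integral converges.
```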

Relevance:

10.00%

Publisher:

Abstract:

Immersions of an m-manifold M in an n-manifold N, with n > m, are classified up to regular homotopy by the homotopy classes of sections of a fibre bundle E associated to the tangent bundle of M. When N = R^n, the fibre of E is the Stiefel manifold of m-frames in n-space.
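Stated symbolically, as a hedged restatement of the Hirsch-Smale classification described above for the Euclidean target case (the identification of E with the bundle of fibrewise-injective maps is an assumption consistent with the abstract, not a quotation from it):

```latex
% Regular-homotopy classes of immersions correspond to homotopy classes of
% sections of the bundle of fibrewise-injective linear maps; its fibre is
% (homotopy equivalent to) the Stiefel manifold V_m(R^n) of m-frames in n-space.
\[
  \pi_0\,\mathrm{Imm}(M,\mathbb{R}^{n}) \;\cong\;
  \pi_0\,\Gamma\bigl(\mathrm{Mono}(TM,\,\underline{\mathbb{R}}^{\,n})\bigr),
  \qquad
  \mathrm{Mono}(TM,\underline{\mathbb{R}}^{\,n})_{x} \simeq V_m(\mathbb{R}^{n}).
\]
```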