926 results for Scaling Of Chf


Relevance: 30.00%

Abstract:

In this paper, we study several natural and man-made complex phenomena from the perspective of dynamical systems. For each class of phenomena, the system outputs are time-series records obtained under identical conditions. The time series are viewed as manifestations of the system behavior and are processed to analyze the system dynamics. First, we use the Fourier transform to process the data and approximate the amplitude spectra by means of power law (PL) functions. We interpret the PL parameters as a phenomenological signature of the system dynamics. Second, we adopt the techniques of non-hierarchical clustering and multidimensional scaling to visualize hidden relationships between the complex phenomena. Third, we propose a vector-field-based analogy to interpret the patterns unveiled by the PL parameters.
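As a rough illustration of the first step, the sketch below (synthetic data and illustrative names, not the authors' code) fits a power law |X(f)| ≈ c·f^m to the amplitude spectrum of a signal by linear regression in log-log coordinates:

```python
import numpy as np

def power_law_signature(x, fs):
    """Fit |X(f)| ~ c * f**m to the amplitude spectrum of signal x.

    Returns (c, m); m is the phenomenological power-law exponent.
    """
    spectrum = np.abs(np.fft.rfft(x))            # amplitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = freqs > 0                             # drop DC so the log-log fit is defined
    m, log_c = np.polyfit(np.log(freqs[mask]), np.log(spectrum[mask]), 1)
    return np.exp(log_c), m

# Example: Brownian noise has an amplitude spectrum close to f**-1.
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))
c, m = power_law_signature(x, fs=1.0)
print(f"c = {c:.3g}, m = {m:.2f}")               # m should come out near -1
```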

Relevance: 30.00%

Abstract:

The last 40 years of the world economy are analyzed by means of computer visualization methods, namely multidimensional scaling and hierarchical clustering trees. The current Western downturn in favor of Asian partners may still be reversed in the coming decades.
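A minimal sketch of this kind of pipeline (hypothetical country data; the paper publishes no code): build a distance matrix over per-country indicator series, embed it with MDS, and draw a hierarchical clustering tree over the same distances.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Hypothetical data: one row of 40 yearly indicator values per country.
rng = np.random.default_rng(1)
countries = ["USA", "Germany", "Japan", "China", "Brazil"]
series = rng.standard_normal((len(countries), 40)).cumsum(axis=1)

condensed = pdist(series, metric="correlation")   # pairwise distances
dist = squareform(condensed)

# 2-D MDS embedding of the countries.
xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(dist)
plt.scatter(xy[:, 0], xy[:, 1])
for (x, y), name in zip(xy, countries):
    plt.annotate(name, (x, y))
plt.show()

# Hierarchical clustering tree over the same distances.
dendrogram(linkage(condensed), labels=countries)
plt.show()
```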

Relevance: 30.00%

Abstract:

Atmospheric temperatures characterize Earth as a slow-dynamics spatiotemporal system, revealing long memory and complex behavior. Temperature time series of 54 worldwide geographic locations are considered as representative of the Earth's weather dynamics. These data are then interpreted as the time evolution of a set of state space variables describing a complex system. The data are analyzed by means of multidimensional scaling (MDS) and the fractional state space portrait (fSSP). A centennial perspective covering the period from 1910 to 2012 allows MDS to identify similarities among different locations on Earth. The multivariate mutual information is proposed to determine the “optimal” order of the time derivative for the fSSP representation. The fSSP emerges as a valuable alternative for visualizing system dynamics.
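An fSSP plots a signal against a fractional-order derivative of itself. As a hedged illustration (one standard discretization, not necessarily the authors' implementation), the Grünwald–Letnikov approximation below computes such a derivative for a sampled series:

```python
import numpy as np
import matplotlib.pyplot as plt

def gl_fractional_derivative(x, alpha, h=1.0):
    """Grunwald-Letnikov fractional derivative of order alpha.

    Weights follow the recursion w[0] = 1, w[k] = w[k-1] * (1 - (alpha + 1) / k),
    and D^alpha x[t] ~ h**(-alpha) * sum_k w[k] * x[t - k].
    """
    n = len(x)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return np.array([w[: t + 1] @ x[t::-1] for t in range(n)]) / h**alpha

# Fractional state space portrait: x(t) versus D^alpha x(t), toy signal.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)                          # stand-in for a temperature series
dx = gl_fractional_derivative(x, alpha=0.8, h=t[1] - t[0])
plt.plot(x, dx)
plt.xlabel("x(t)")
plt.ylabel(r"$D^{0.8} x(t)$")
plt.show()
```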

Relevance: 30.00%

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance: 30.00%

Abstract:

Thesis submitted to the Instituto Superior de Estatística e Gestão de Informação da Universidade Nova de Lisboa in partial fulfillment of the requirements for the Degree of Doctor of Philosophy in Information Management – Geographic Information Systems.

Relevance: 30.00%

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics.

Relevance: 30.00%

Abstract:

Dissertation submitted to obtain the Degree of Master in Informatics Engineering.

Relevance: 30.00%

Abstract:

Historical renders are exposed to several degradation processes that can lead to a wide range of anomalies, such as scaling, detachments, and pulverization. Among the common anomalies, loss of cohesion and loss of adhesion are usually identified as the most difficult to repair; these anomalies still need to be studied in depth in order to design compatible, durable, and sustainable conservation treatments. The restitution of render cohesion can be achieved using consolidating products. Nevertheless, repair treatments may induce aesthetic alterations and are therefore usually followed by chromatic reintegration. This work aims to study the effectiveness of mineral products as consolidants for lime-based mortars and, simultaneously, as chromatic treatments for pigmented renders. The studied consolidating products are prepared by mixing air lime, metakaolin, water, and mineral pigments. The idea of these consolidating and coloring products arises from a traditional lime-based technique, the limewash, widely diffused in southern Europe and the Mediterranean area. The consolidating products were applied and tested on lime-based mortar specimens with a low binder-aggregate ratio and therefore with reduced cohesion. A physico-mechanical, microstructural, and mineralogical characterization was performed on untreated and treated specimens in order to evaluate the efficacy and durability of the treatments. Accelerated aging tests were also performed to assess consolidant durability under aggressive conditions. The results showed that the consolidants tested are compatible, effective, and show good durability.

Relevance: 30.00%

Abstract:

The development of devices based on heterostructured thin films of biomolecules can make a major contribution to the biomedical field. However, to achieve high efficiency in these devices, it is mandatory to store water molecules inside the heterostructures in order to keep the biological molecules hydrated. Such a hydrated environment may be achieved with lipid molecules, which have the ability to rearrange spontaneously into vesicles, creating a stable barrier between two aqueous compartments. Yet it is necessary to find conditions that lead to the immobilization of whole vesicles on the heterostructures. In this work, the conditions that govern the deposition of open and closed liposomes of 1,2-dipalmitoyl-sn-glycero-3-[phospho-rac-(1-glycerol)] (sodium salt) (DPPG) onto polyelectrolyte cushions prepared by the layer-by-layer (LbL) method were analyzed. Electronic transitions of DPPG molecules, as well as absorption coefficients, were obtained by vacuum ultraviolet spectroscopy, while the elemental composition of the heterostructures was characterized by X-ray photoelectron spectroscopy (XPS). The presence of water molecules in the films was inferred by XPS and infrared spectroscopy. Quartz crystal microbalance (QCM) data analysis showed that, in certain cases, the adsorbed amount of DPPG depends on the number of bilayers already adsorbed. Moreover, the adsorption kinetics curves of both the adsorbed amount and the surface roughness allowed the determination of kinetics parameters related to the adsorption processes, namely electrostatic forces, liposome diffusion, and lipid re-organization on the surface. Scaling exponents obtained from the statistical analysis of atomic force microscopy images demonstrate that the DPPG vesicle adsorption mechanism is ruled by the diffusion-driven Villain model, confirming that adsorption is governed by electrostatic forces. The power spectral density treatment enabled a thorough description of the accessible surface of the samples, as well as of their inner structural properties. These outcomes proved that surface roughness influences the adsorption of DPPG liposomes onto surfaces covered by a polyelectrolyte layer: low roughness was shown to induce liposome rupture, creating a lipid bilayer, while high roughness allows the adsorption of whole liposomes. In addition, the fraction of open liposomes calculated from the normalized maximum adsorbed amounts decreases as the cushion roughness increases, allowing us to conclude that surface roughness is a crucial variable governing the adsorption of open or whole liposomes. This conclusion is fundamental for the development of well-designed sensors based on functional biomolecules incorporated in liposomes. Indeed, LbL films composed of polyelectrolytes and liposomes, with and without encapsulated melanin, were successfully applied as olive oil sensors.
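For context, scaling analyses of this kind typically start from the interface width (RMS roughness) and the power spectral density of AFM height maps. A hedged sketch with synthetic data (function names and parameters are illustrative, not from the thesis):

```python
import numpy as np

def rms_roughness(height):
    """Interface width w: RMS deviation of the height map from its mean."""
    return np.sqrt(np.mean((height - height.mean()) ** 2))

def radial_psd(height):
    """Radially averaged power spectral density of a square height map."""
    n = height.shape[0]
    psd2d = np.abs(np.fft.fftshift(np.fft.fft2(height))) ** 2
    y, x = np.indices(psd2d.shape)
    r = np.hypot(x - n // 2, y - n // 2).astype(int)
    radial = np.bincount(r.ravel(), weights=psd2d.ravel()) / np.bincount(r.ravel())
    return radial[1 : n // 2]   # skip DC, keep up to the Nyquist frequency

# White noise stands in for an AFM image here (its PSD is flat; real
# surfaces show a power-law decay whose slope gives the roughness exponent).
rng = np.random.default_rng(2)
surface = rng.standard_normal((256, 256))
print("w =", rms_roughness(surface))

psd = radial_psd(surface)
k = np.arange(1, len(psd) + 1)
slope, _ = np.polyfit(np.log(k), np.log(psd), 1)   # log-log fit of the PSD
print("PSD slope =", slope)
```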

Relevance: 30.00%

Abstract:

This research paper investigates how market conditions at the base of the pyramid (BOP) influence South African small and medium-sized enterprises (SMEs) to take certain business decisions in townships and rural areas. It takes a qualitative approach to explore how SMEs with social objectives develop mitigating strategies to successfully engage with and in poor communities. The research suggests that prevailing BOP strategies lack certain aspects needed to realize them successfully on the ground. It advises firms to take a more practical, hands-on approach to identifying a sustainable business model by testing, experimenting, learning, and adjusting, eventually becoming eligible for scaling up.

Relevance: 30.00%

Abstract:

This paper examines modern economic growth according to the multidimensional scaling (MDS) method and state space portrait (SSP) analysis. Taking GDP per capita as the main indicator of economic growth and prosperity, the long-run perspective from 1870 to 2010 identifies the main similarities among the modern economic growth of 34 world partners and exemplifies the historical waving mechanics of the largest world economy, the USA. MDS reveals two main clusters among the European countries and their old offshore territories, and SSP identifies the Great Depression as a mild challenge to the American global performance, when compared to the Second World War and the 2008 crisis.
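In its simplest integer-order form, an SSP plots a variable against its time derivative. A hedged sketch with an illustrative stand-in series (not the paper's data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative stand-in for GDP per capita: exponential trend plus shocks.
rng = np.random.default_rng(3)
years = np.arange(1870, 2011)
noise = 0.05 * rng.standard_normal(len(years)).cumsum()
gdp = 3000.0 * np.exp(0.018 * (years - 1870) + noise)

# State space portrait: x(t) against dx/dt, approximated numerically.
dgdp = np.gradient(gdp, years)
plt.plot(gdp, dgdp)
plt.xlabel("GDP per capita")
plt.ylabel("d(GDP per capita)/dt")
plt.show()
```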

Relevance: 30.00%

Abstract:

Current computer systems have evolved from featuring only a single processing unit and limited RAM, on the order of kilobytes or a few megabytes, to featuring several multicore processors, offering on the order of several tens of concurrent execution contexts, and main memory on the order of several tens to hundreds of gigabytes. This makes it possible to keep all data of many applications in main memory, leading to the development of in-memory databases (IMDBs). Compared to disk-backed databases, IMDBs are expected to provide better performance by incurring less I/O overhead. In this dissertation, we present a scalability study of two general-purpose IMDBs on multicore systems. The results show that current general-purpose IMDBs do not scale on multicores, due to contention among threads running concurrent transactions. In this work, we explore different directions to overcome the scalability issues of IMDBs on multicores, while enforcing strong isolation semantics. First, we present a solution that requires no modification to either the database systems or the applications, called MacroDB. MacroDB replicates the database among several engines, using a master-slave replication scheme, where update transactions execute on the master while read-only transactions execute on the slaves. This reduces contention, allowing MacroDB to offer scalable performance under read-only workloads, while update-intensive workloads suffer a performance loss when compared to the standalone engine. Second, we delve into the database engine and identify the concurrency control mechanism used by the storage sub-component as a scalability bottleneck. We then propose a new locking scheme that allows the removal of such mechanisms from the storage sub-component. This modification offers a performance improvement under all workloads, when compared to the standalone engine, while scalability is limited to read-only workloads. Next, we address the scalability limitations for update-intensive workloads and propose reducing the locking granularity from the table level to the attribute level. This further improves performance for intensive and moderate update workloads, at a slight cost for read-only workloads; scalability is limited to read-intensive and read-only workloads. Finally, we investigate the impact applications have on the performance of database systems by studying how the order of operations inside transactions influences database performance. We then propose a Read-before-Write (RbW) interaction pattern, under which transactions perform all read operations before executing write operations. The RbW pattern allowed TPC-C to achieve scalable performance on our modified engine for all workloads. Additionally, the RbW pattern allowed our modified engine to achieve scalable performance on multicores, almost up to the total number of cores, while enforcing strong isolation.
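A minimal sketch of the Read-before-Write idea as it appears in application code (generic SQL via sqlite3; the table, columns, and transfer logic are hypothetical, and this is not the dissertation's modified engine):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO account VALUES (?, ?)", [(1, 100.0), (2, 50.0)])

def transfer_rbw(conn, src, dst, amount):
    """Transfer following the RbW pattern: all reads first, then all writes."""
    with conn:  # one transaction
        # Read phase: gather every value the transaction needs.
        (src_bal,) = conn.execute(
            "SELECT balance FROM account WHERE id = ?", (src,)).fetchone()
        (dst_bal,) = conn.execute(
            "SELECT balance FROM account WHERE id = ?", (dst,)).fetchone()
        if src_bal < amount:
            raise ValueError("insufficient funds")
        # Write phase: no reads after the first write.
        conn.execute("UPDATE account SET balance = ? WHERE id = ?",
                     (src_bal - amount, src))
        conn.execute("UPDATE account SET balance = ? WHERE id = ?",
                     (dst_bal + amount, dst))

transfer_rbw(conn, 1, 2, 25.0)
print(conn.execute("SELECT * FROM account").fetchall())
```

Grouping all reads before the first write means the engine knows, at the moment writing starts, the complete read set of the transaction, which is what gives the scheduler room to reduce lock contention.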

Relevance: 30.00%

Abstract:

The year is 2015, and the startup and tech business ecosphere has never seen more activity. In New York City alone, the tech startup industry is on track to amass $8 billion in total funding, the highest in 7 years (CB Insights, 2015). According to the Kauffman Index of Entrepreneurship (2015), this figure represents just 20% of the total funding in the United States. Thanks to platforms that link entrepreneurs with investors, there are simply more funding opportunities than ever, and funding can be initiated in a variety of ways (angel investors, venture capital firms, crowdfunding). And yet, in spite of all this, according to Forbes Magazine (2015), nine of ten startups will fail. Because of the unpredictable nature of the modern tech industry, it is difficult to pinpoint exactly why 90% of startups fail, but the general consensus among top tech executives is that “startups make products that no one wants” (Fortune, 2014). In 2011, author Eric Ries wrote a book called The Lean Startup in an attempt to solve this all-too-familiar problem. It was in this book that he developed the framework for the Hypothesis-Driven Entrepreneurship Process, an iterative process that aims at proving a market before actually launching a product. Ries discusses concepts such as the Minimum Viable Product, the smallest set of activities necessary to disprove a hypothesis (or business model characteristic). Ries encourages acting quickly and often: if you are to fail, then fail fast. In today’s fast-moving economy, an entrepreneur cannot afford to waste his own time, nor his customer’s time. The purpose of this thesis is to conduct an in-depth analysis of the Hypothesis-Driven Entrepreneurship Process in order to test the market viability of a real-life startup idea, ShowMeAround. This analysis will follow the scientific Lean Startup approach, with the purpose of developing a functional business model and business plan. The objective is to conclude with an investment-ready startup idea, backed by rigorous entrepreneurial study.

Relevance: 30.00%

Abstract:

Traffic Engineering (TE) approaches are increasingly important in network management to allow an optimized configuration and resource allocation. In link-state routing, the task of setting appropriate weights on the links is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work addresses the evaluation of three distinct EAs, one single-objective and two multi-objective EAs, in two tasks related to weight-setting optimization towards optimal intra-domain routing, knowing the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where there is a dynamic alteration in the state of the system: the first considers changes in the traffic demand matrices, and the latter considers the possibility of link failures. The methods thus need to simultaneously optimize for both conditions, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs, such as SPEA2 and NSGA-II, came naturally, and these were compared to a single-objective EA. The results show a remarkable behavior of NSGA-II in all proposed tasks, scaling well for harder instances, and thus presenting itself as the most promising option for TE in these scenarios.
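A heavily simplified single-objective sketch of the weight-setting problem (toy topology, unit capacities, and demands are invented; real OSPF studies use ECMP routing and richer cost functions, which this toy omits):

```python
import random
import networkx as nx

random.seed(0)

# Toy topology: links with unit capacity; demands between node pairs.
EDGES = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
DEMANDS = [(0, 3, 0.6), (1, 3, 0.5), (0, 2, 0.4)]   # (src, dst, volume)

def congestion(weights):
    """Max link utilization when demands follow weighted shortest paths."""
    g = nx.Graph()
    for (u, v), w in zip(EDGES, weights):
        g.add_edge(u, v, weight=w)
    load = {e: 0.0 for e in EDGES}
    for src, dst, vol in DEMANDS:
        path = nx.shortest_path(g, src, dst, weight="weight")
        for u, v in zip(path, path[1:]):
            e = (u, v) if (u, v) in load else (v, u)
            load[e] += vol
    return max(load.values())   # capacities are 1, so load = utilization

def mutate(weights):
    w = list(weights)
    w[random.randrange(len(w))] = random.randint(1, 20)
    return w

# Plain (1+1)-style evolutionary loop over integer link weights.
best = [random.randint(1, 20) for _ in EDGES]
for _ in range(500):
    child = mutate(best)
    if congestion(child) <= congestion(best):
        best = child
print("weights:", best, "congestion:", congestion(best))
```

The preventive variant in the paper would evaluate each weight vector under both the normal and the altered condition (changed demands or a failed link), turning this into the bi-objective problem that SPEA2 and NSGA-II address.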

Relevance: 30.00%

Abstract:

In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches for calibration are used: scaling up canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. Validation of the models is achieved using eddy covariance data from the LBA site C14. Comparing the performance of both models, we conclude that both numerically (in terms of goodness of fit) and qualitatively (in terms of residual response to different environmental variables), sun/shade does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit and fails to respond to variations in the diffuse fraction, also having skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, in combination with the separation of the incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
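To make the contrast concrete, here is a deliberately minimal sketch of the two canopy schemes, using a rectangular-hyperbola leaf light response and a Beer's-law sunlit fraction; the parameter values are illustrative, not the paper's calibration:

```python
import numpy as np

ALPHA, AMAX, K = 0.05, 20.0, 0.5   # quantum yield, max assimilation, extinction

def leaf_photosynthesis(par):
    """Rectangular hyperbola light response at the leaf level."""
    return ALPHA * par * AMAX / (ALPHA * par + AMAX)

def big_leaf_gpp(par, lai):
    """Whole canopy treated as one leaf absorbing Beer's-law light."""
    absorbed = par * (1.0 - np.exp(-K * lai))
    return leaf_photosynthesis(absorbed)

def sun_shade_gpp(par_direct, par_diffuse, lai):
    """Sunlit and shaded leaf fractions handled separately, then summed.

    Units are loose in this toy; real models scale parameters consistently.
    """
    lai_sun = (1.0 - np.exp(-K * lai)) / K        # sunlit leaf area
    lai_shade = lai - lai_sun
    a_sun = leaf_photosynthesis(K * par_direct + par_diffuse) * lai_sun
    a_shade = leaf_photosynthesis(par_diffuse) * lai_shade
    return a_sun + a_shade

par = 1500.0   # umol photons m-2 s-1
for diffuse_frac in (0.1, 0.5, 0.9):
    ss = sun_shade_gpp(par * (1 - diffuse_frac),
                       par * diffuse_frac * 0.5,   # crude mean in-canopy diffuse
                       lai=5.0)
    bl = big_leaf_gpp(par, lai=5.0)
    print(f"diffuse={diffuse_frac:.1f}  sun/shade={ss:.1f}  big leaf={bl:.1f}")
```

Running it shows the pattern the abstract describes: the big-leaf value is constant across diffuse fractions, while the sun/shade value rises as more light reaches the shaded leaves.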