936 results for Vector Space Model
Abstract:
The quantitative structure-property relationship (QSPR) for the boiling point (Tb) of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) was investigated. The molecular distance-edge vector (MDEV) index was used as the structural descriptor. The quantitative relationship between the MDEV index and Tb was modeled using multivariate linear regression (MLR) and an artificial neural network (ANN). Leave-one-out cross validation and external validation were carried out to assess the prediction performance of the developed models. For the MLR method, the prediction root mean square relative error (RMSRE) of leave-one-out cross validation and of external validation was 1.77 and 1.23, respectively. For the ANN method, the corresponding prediction RMSRE was 1.65 and 1.16. A quantitative relationship between the MDEV index and the Tb of PCDD/Fs was demonstrated, and both MLR and ANN are practicable for modeling it. The developed MLR and ANN models can be used to predict the Tb of PCDD/Fs; accordingly, the Tb of each PCDD/F was predicted with them.
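The validation procedure described above can be sketched in a few lines (a minimal NumPy illustration; the descriptor matrix contents and the assumption that RMSRE is reported in percent are mine, not from the abstract):

```python
import numpy as np

def rmsre(y_true, y_pred):
    """Root mean square relative error, in percent (assumed unit)."""
    rel = (y_pred - y_true) / y_true
    return 100.0 * np.sqrt(np.mean(rel ** 2))

def loo_cv_mlr(X, y):
    """Leave-one-out cross validation of a multivariate linear regression.

    X: (n, p) descriptor matrix (e.g. MDEV index values), y: (n,) boiling points.
    Each sample is predicted by a model fitted on the other n-1 samples.
    """
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        A = np.column_stack([np.ones(mask.sum()), X[mask]])   # add intercept
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        preds[i] = np.concatenate([[1.0], X[i]]) @ coef
    return rmsre(y, preds)
```

A small RMSRE from `loo_cv_mlr` corresponds to the prediction errors of one to two percent reported in the abstract.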
Abstract:
A model for predicting temperature evolution for automatic control systems in manufacturing processes that require the coiling of bars in the transfer table is presented. Although the method is of a general nature, the presentation in this work refers to the manufacturing of steel plates in hot rolling mills. The prediction strategy is based on a mathematical model of the evolution of temperature in a coiling and uncoiling bar, formulated as a parabolic partial differential equation on a shape-changing domain. The mathematical model is solved numerically by space discretization via geometrically adaptive finite elements, which accommodate the change in shape of the domain, using a computationally novel treatment of the thermal contact problem that results from coiling. Time is discretized according to a Crank-Nicolson scheme. Since the actual physical process takes less time than the process-controlling computer requires to solve the full mathematical model, a special predictive device was developed in the form of a set of least-squares polynomials based on the off-line numerical solution of the mathematical model.
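The Crank-Nicolson time discretization mentioned above can be illustrated on the simplest constant-domain 1D heat equation (an illustrative NumPy sketch only; the actual model uses geometrically adaptive finite elements on a shape-changing domain with thermal contact, none of which appears here):

```python
import numpy as np

def crank_nicolson_heat(u0, alpha, dx, dt, steps):
    """Crank-Nicolson stepping for u_t = alpha * u_xx on a fixed 1D grid
    with Dirichlet (fixed-temperature) boundaries."""
    n = len(u0)
    r = alpha * dt / dx ** 2
    # Second-difference operator L, then (I - r/2 L) u_new = (I + r/2 L) u_old.
    L = (np.diag(-2.0 * np.ones(n))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    A = np.eye(n) - 0.5 * r * L
    B = np.eye(n) + 0.5 * r * L
    # Keep boundary rows as identity so the end temperatures stay fixed.
    for M in (A, B):
        M[0, :] = 0.0; M[0, 0] = 1.0
        M[-1, :] = 0.0; M[-1, -1] = 1.0
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)
    return u
```

The scheme is unconditionally stable and second-order in time, which is why it is a common choice for thermal problems like the one in the abstract.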
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the currently popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field; digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independence of the nodes also implies that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else, in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
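The queue-and-firing-rule semantics described in these abstracts can be illustrated with a toy interpreter (a hypothetical Python sketch, not RVC-CAL; the `Queue`, `Adder`, and `run` names are invented for illustration):

```python
from collections import deque

class Queue:
    """FIFO edge between dataflow nodes; the only allowed communication channel."""
    def __init__(self):
        self._q = deque()
    def push(self, tok): self._q.append(tok)
    def pop(self): return self._q.popleft()
    def tokens(self): return len(self._q)

class Adder:
    """A node fires only when its firing rule holds: one token on each input."""
    def __init__(self, a, b, out):
        self.a, self.b, self.out = a, b, out
    def can_fire(self):
        return self.a.tokens() >= 1 and self.b.tokens() >= 1
    def fire(self):
        self.out.push(self.a.pop() + self.b.pop())

def run(nodes):
    """Naive fully dynamic scheduler: fire any fireable node until none remains.
    Quasi-static scheduling, as in the thesis, would replace most of these
    run-time can_fire checks with pre-computed static firing sequences."""
    progress = True
    while progress:
        progress = False
        for n in nodes:
            if n.can_fire():
                n.fire()
                progress = True
```

Note that `run` evaluates a firing rule on every iteration for every node; this per-firing overhead is exactly what the quasi-static scheduling described above aims to eliminate.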
Abstract:
Pilocarpine-induced (320 mg/kg, ip) status epilepticus (SE) in adult (2-3 months) male Wistar rats results in extensive neuronal damage in limbic structures. Here we investigated whether the induction of a second SE (N = 6) would generate damage and cell loss similar to that seen after a first SE (N = 9). Counts of silver-stained (indicative of cell damage) cells, using the Gallyas argyrophil III method, revealed a markedly lower neuronal injury in animals submitted to re-induction of SE compared to rats exposed to a single episode of pilocarpine-induced SE. This effect could be explained as follows: 1) the first SE removes the vulnerable cells, leaving behind resistant cells that are not affected by the second SE; 2) the first SE confers increased resistance to the remaining cells, analogous to the process of ischemic tolerance. Counting of Nissl-stained cells was performed to differentiate between these alternative mechanisms. Our data indicate that different neuronal populations react differently to SE induction. For some brain areas most, if not all, of the vulnerable cells are lost after an initial insult leaving only relatively resistant cells and little space for further damage or cell loss. For some other brain areas, in contrast, our data support the hypothesis that surviving cells might be modified by the initial insult which would confer a sort of excitotoxic tolerance. As a consequence of both mechanisms, subsequent insults after an initial insult result in very little damage regardless of their intensity.
Abstract:
Permanent magnet synchronous machines (PMSMs) have become widely used because of their high efficiency compared to synchronous machines with an excitation winding or to induction motors. This feature of the PMSM is achieved through the use of permanent magnets (PMs) as the main excitation source. The magnetic properties of the PM have a significant influence on all PMSM characteristics. Recent observations of PM material properties in rotating machines have revealed that the magnets do not necessarily operate in the second quadrant of the demagnetization curve, which makes them prone to hysteresis losses. Moreover, no good analytical approach has yet been derived for the magnetic flux density distribution along the PM during different short-circuit faults. The main task of this thesis is to derive a simple analytical tool that can predict the magnetic flux density distribution along a rotor-surface-mounted PM in two cases: during normal operation, and at the instant of a three-phase symmetrical short circuit that is worst from the PM's point of view. Surface-mounted PMSMs were selected because of their prevalence and relatively simple construction. The proposed model is based on the combination of two theories: magnetic circuit theory and space vector theory. A comparison of finite element results for the normal operating mode with the results calculated by the proposed model shows good model accuracy in the parts of the PM that are most prone to hysteresis losses. The comparison of the results for the three-phase symmetrical short circuit revealed significant inaccuracy of the proposed model relative to the finite element software. The reasons for this inaccuracy were analyzed, including the impact on the model of the Carter factor theory and of the assumption that air has the permeability of the PM.
Propositions for further development of the model are presented.
Abstract:
Overexpression of cytokine-induced apoptosis inhibitor 1 (CIAPIN1) contributes to multidrug resistance (MDR) in breast cancer. This study aimed to evaluate the potential of CIAPIN1 gene silencing by RNA interference (RNAi) as a treatment for drug-resistant breast cancer and to investigate the effect of CIAPIN1 on the drug resistance of breast cancer in vivo. We used lentivirus-vector-based RNAi to knock down CIAPIN1 in nude mice bearing MDR breast cancer tumors and found that lentivirus-vector-mediated silencing of CIAPIN1 could efficiently and significantly inhibit tumor growth when combined with chemotherapy in vivo. Furthermore, Western blot analysis showed that both CIAPIN1 and P-glycoprotein expression were efficiently downregulated, and P53 was upregulated, after RNAi. Therefore, we concluded that lentivirus-vector-mediated RNAi targeting of CIAPIN1 is a potential approach to reverse MDR of breast cancer. In addition, CIAPIN1 may participate in MDR of breast cancer by regulating P-glycoprotein and P53 expression.
Abstract:
The SRY-related high-mobility-group box 9 (Sox9) gene encodes a cartilage-specific transcription factor that plays essential roles in chondrocyte differentiation and cartilage formation. The aim of this study was to investigate the feasibility of genetic delivery of Sox9 to enhance chondrogenic differentiation of human umbilical cord blood-derived mesenchymal stem cells (hUC-MSCs). After they were isolated from human umbilical cord blood within 24 h after delivery of neonates, hUC-MSCs were untreated or transfected with a human Sox9-expressing plasmid or an empty vector. The cells were assessed for morphology and chondrogenic differentiation. The isolated cells, which showed a fibroblast-like morphology in monolayer culture, were positive for the MSC markers CD44, CD105, CD73, and CD90, but negative for the differentiation markers CD34, CD45, CD19, and CD14 and for major histocompatibility complex class II. Sox9 overexpression induced accumulation of sulfated proteoglycans without altering the cellular morphology. Immunocytochemistry demonstrated that genetic delivery of Sox9 markedly enhanced the expression of aggrecan and type II collagen in hUC-MSCs compared with empty-vector-transfected counterparts. Reverse transcription-polymerase chain reaction analysis further confirmed the elevation of aggrecan and type II collagen at the mRNA level in Sox9-transfected cells. Taken together, short-term Sox9 overexpression facilitates chondrogenesis of hUC-MSCs and may thus have potential implications in cartilage tissue engineering.
Abstract:
Product assurance is an essential part of the product development process if developers want to ensure that the final product is safe and reliable. Product assurance can be supported with risk management and with different failure analysis methods. Product assurance is emphasized in the system development process of mission-critical systems, and the product assurance process in systems of this kind requires extra attention. In this thesis, the mission-critical systems are space systems, and the product assurance process of these systems is presented with the help of space standards. The product assurance process can be supported with agile development, because agile emphasizes transparency of the process and fast response to changes. Even though the development process of space systems is highly standardized and resembles the waterfall model, it is still possible to adopt agile development in space systems development. This thesis aims to support the product assurance process of space systems with agile development so that the final product is as safe and reliable as possible. The main purpose of this thesis is to examine how well product assurance is performed in Finnish space organizations and how product assurance tasks and activities can be supported with agile development. The research part of this thesis was carried out as a survey.
Abstract:
Over time, the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008, the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but they also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but they still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L-model-based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a vector autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. The results show that the B–L-model-based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting the allocation towards riskier assets when the market is turning bullish, without overweighting investments with a high beta. Based on the findings of the thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation remains highly dependent on the quality of the input estimates.
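The forecasting step described above, fitting a VAR on lagged values and producing one-step-ahead return forecasts, can be sketched as follows (a minimal NumPy sketch of a VAR(1); the thesis does not specify the lag order or variable set, so these are illustrative assumptions, and the Black–Litterman blending of the forecasts with equilibrium returns is not shown):

```python
import numpy as np

def fit_var1(Y):
    """Least-squares fit of a VAR(1) model Y_t = c + A @ Y_{t-1} + e_t.

    Y: (T, k) matrix of asset-class returns and economic variables.
    Returns the intercept c (k,) and the coefficient matrix A (k, k).
    """
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])   # lagged values + const
    coef, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return coef[0], coef[1:].T

def forecast(c, A, y_last):
    """One-step-ahead forecast, usable as a quantified 'view' in a B-L setup."""
    return c + A @ y_last
```

In a B–L workflow the `forecast` output would enter the view vector, with the view uncertainty taken, for example, from the residual covariance of the fitted VAR.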
Abstract:
Various studies in the field of econophysics have shown that fluid flows have analogous phenomena in financial market behavior, the typical parallel being drawn between energy in fluids and information in markets. However, the geometry of the manifold on which market dynamics play out (the corporate space) is not yet known. In this thesis, utilizing a seven-year time series of the prices of the stocks used to compute the S&P500 index on the New York Stock Exchange, we have created a local chart of the corporate space with the goal of finding standing waves and other soliton-like patterns in the behavior of stock price deviations from the S&P500 index. By first calculating the correlation matrix of normalized stock price deviations from the S&P500 index, we have performed a local singular value decomposition over a set of four different time windows as guides to the nature of the patterns that may emerge. It turns out that in almost all cases each singular vector is essentially determined by a relatively small set of companies with large positive or negative weights on that singular vector. Over particular time windows, these weights are sometimes strongly correlated with at least one industrial sector, and certain sectors are more prone to fast dynamics whereas others sustain longer standing waves.
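The windowed decomposition described above can be sketched as follows (an illustrative NumPy sketch; the thesis does not give the exact normalization of the deviations, so the one used here is an assumption):

```python
import numpy as np

def leading_modes(prices, index, k=3):
    """Local SVD of normalized stock price deviations from an index.

    prices: (T, n) price series for n stocks over one time window,
    index: (T,) index level over the same window.
    Returns the k leading singular values and right singular vectors;
    each vector is a weight pattern over companies, typically dominated
    by a small set of large positive or negative weights.
    """
    dev = prices / prices[0] - (index / index[0])[:, None]  # normalized deviations
    dev = dev - dev.mean(axis=0)                            # center each stock
    _, s, Vt = np.linalg.svd(dev, full_matrices=False)
    return s[:k], Vt[:k]
```

Running this over several sliding windows, as the thesis does with four window lengths, lets one watch how the dominant company weights drift in time.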
Abstract:
Health Innovation Village at GE is one of the new communities targeted at startup and growth-oriented companies. It has been established at the premises of a multinational conglomerate to promote the networking and growth of startup companies. The concept combines features from traditional business incubators, accelerators, and coworking spaces. This research compares Health Innovation Village to these concepts with regard to its goals, target clients, sources of income, organization, facilities, management, and success factors. In addition, a new incubator classification model is introduced. Health Innovation Village is also examined from its tenants' perspective, and improvements are suggested. The work was implemented as a qualitative case study by interviewing GE staff with connections to Health Innovation Village as well as startup entrepreneurs and employees working there. The most evident features of Health Innovation Village correspond to those of business incubators, although it is atypical as a non-profit corporate business incubator. Strong network orientation and connections to venture capitalists are common characteristics of these new types of accelerators. The design of the premises conforms to the principles of coworking spaces, but the services provided to the startup companies are considerably more versatile than those offered by coworking spaces. The advantages of Health Innovation Village are its first-class premises and the exceptionally good networking possibilities that other types of incubators or accelerators are not able to offer. A conglomerate can also provide multifaceted specialist knowledge to young firms. In addition, both GE and the startups gained considerable publicity through their cooperation, a characteristic that benefits both parties. Most of the entrepreneurs' expectations were exceeded. However, communication and the scope of cooperation remain challenges.
Micro companies spend their time developing and marketing their products and acquiring financing; therefore, communication should be as clear as possible and accessible everywhere. The startups would prefer to cooperate significantly more, but few have the time available to assume the responsibility of leadership. The entrepreneurs also expected to have more possibilities for cooperation with GE. Wider collaboration might be accomplished through curation, in the same way as it is used in well-functioning coworking spaces, where curators take care of practicalities and promote cooperation. Communication issues could be alleviated if the community had its own intranet pages where all information could be concentrated. In particular, a common calendar and a room reservation system could be useful. In addition, it could be beneficial to have a section of the intranet open to both the GE staff and the startups, so that those willing to share their knowledge and those with project offers could use it for advertising.
Abstract:
The topic of this thesis is marginal/minority popular music and the question of identity; the term "marginal/minority" specifically refers to members of racial and cultural minorities who are socially and politically marginalized. The thesis argument is that popular music produced by members of cultural and racial minorities establishes cultural identity and resists racist discourse. Three marginal/minority popular music artists and their songs have been chosen for analysis in support of the argument: Gil Scott-Heron's "Gun," Tracy Chapman's "Fast Car," and Robbie Robertson's "Sacrifice." The thesis draws from two fields of study: popular music and postcolonialism. Within the area of popular music, Theodor Adorno's "Standardization" theory is the focus. Within the area of postcolonialism, this thesis concentrates on two specific topics: 1) Stuart Hall's and Homi Bhabha's overlapping perspectives that identity is a process of cultural signification, and 2) Homi Bhabha's concept of the "Third Space." For Bhabha (1995a), the Third Space defines cultures in the moment of their use, at the moment of their exchange. The idea of identities arising out of cultural struggle suggests that identity is a process as opposed to a fixed center, an enclosed totality. Cultures arise from historical memory, and memory has no center. Historical memory is de-centered, and thus cultures are also de-centered; they are not enclosed totalities. This is what Bhabha means by the "hybridity" of culture: cultures are not unitary totalities but ways of knowing and speaking about a reality that is in constant flux. In this regard, the language of "Otherness" depends on suppressing or marginalizing the productive capacity of culture in the act of enunciation.
The Third Space represents a strategy of enunciation that disrupts, interrupts, and dislocates the dominant discursive construction of US and THEM (a construction explained by Hall's concept of binary oppositions, detailed in Chapter 2). Bhabha uses the term "enunciation" as a linguistic metaphor for how cultural differences are articulated through discourse and thus how differences are discursively produced. Like Hall, Bhabha views culture as a process of understanding and of signification, because Bhabha sees traditional cultures' struggle against colonizing cultures as transforming them. Adorno's theory of Standardization will be understood as a theoretical position of Western authority. The thesis will argue that Adorno's theory rests on the assumption that there is an "essence" to music, an essence that Adorno rationalizes as structure/form. The thesis will demonstrate that constructing music as possessing an essence is connected to ideology and power, and that in this regard Adorno's Standardization theory is a discourse of White Western power. It will be argued that "essentialism" is at the root of the Western "rationalization" of music, and that the definition of what constitutes music is an extension of Western racist "discourses" of the Other. The methodological framework of the thesis entails a) applying semiotics to each of the three songs examined and b) applying Bhabha's model of the Third Space to each of the songs. In this thesis, semiotics specifically refers to Stuart Hall's retheorized semiotics, which recognizes the dual function of semiotics in the analysis of marginal racial/cultural identities: it simultaneously represents embedded racial/cultural stereotypes and the marginal racial/cultural first person voice that disavows and thus reinscribes stereotyped identities. (Here, and throughout this thesis, "first person voice" is used not to denote the voice of the songwriter, but rather the collective voice of a marginal racial/cultural group.)
This dual function fits with Hall's and Bhabha's idea that cultural identity emerges out of cultural antagonism, cultural struggle. Bhabha's Third Space is also applied to each of the songs to show that cultural "struggle" between colonizers and colonized produces cultural hybridities, musically expressed as fusions of styles/sounds. The purpose of combining semiotics and postcolonialism in the three songs to be analyzed is to show that marginal popular music, produced by members of cultural and racial minorities, establishes cultural identity and resists racist discourse by overwriting identities of racial/cultural stereotypes with identities shaped by the first person voice enunciated in the Third Space, to produce identities of cultural hybridities. Semiotic codes of embedded "Black" and "Indian" stereotypes in each song's musical and lyrical text will be read and shown to be overwritten by the semiotic codes of the first person voice, which are decoded with the aid of postcolonial concepts such as "ambivalence," "hybridity" and "enunciation."
Abstract:
This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
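The leads-and-lags augmentation described above can be sketched for a single equation in the style of dynamic OLS (an illustrative NumPy sketch; the paper's estimator additionally stacks the seemingly unrelated equations and applies feasible GLS with a long-run covariance matrix, which is omitted here):

```python
import numpy as np

def leads_lags_design(x, p):
    """Regressor matrix with the level of an integrated regressor x (T,)
    plus p leads and p lags of its first difference, as in dynamic OLS."""
    dx = np.diff(x)                      # dx[t] = x[t+1] - x[t]
    T = len(x)
    rows = list(range(p, T - 1 - p))     # observations where all terms exist
    cols = [np.ones(len(rows)), x[p:T - 1 - p]]
    for j in range(-p, p + 1):
        cols.append(dx[p + j : T - 1 - p + j])
    return np.column_stack(cols), rows

def dols(y, x, p=2):
    """Estimate the cointegrating coefficient of y on x by least squares
    on the augmented regression."""
    Z, rows = leads_lags_design(x, p)
    coef, *_ = np.linalg.lstsq(Z, y[rows], rcond=None)
    return coef[1]                       # coefficient on the level of x
```

Adding the differenced terms soaks up the correlation between the stationary errors and the regressor's innovations, which is what yields the mixed normal limit that makes standard inference possible.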
Abstract:
The attached file was created with Scientific WorkPlace LaTeX.