923 results for Large Size


Relevance: 60.00%

Abstract:

We consider a variation of the prototype combinatorial optimization problem known as graph colouring. Our goal is to colour the vertices of a graph with a fixed number of colours so as to maximize the number of different colours present in the set of nearest neighbours of each vertex. This problem, which we pictorially call palette-colouring, has recently been addressed as a basic example of a problem arising in the context of distributed data storage. Even though it has not been proved to be NP-complete, random search algorithms find the problem hard to solve. Heuristics based on a naive belief propagation algorithm are observed to work quite well in certain conditions. In this paper, we build on that result, working out the correct belief propagation algorithm, which needs to take into account the many-body nature of the constraints present in this problem. This method improves on the naive belief propagation approach at the cost of increased computational effort. We also investigate the emergence of a satisfiable-to-unsatisfiable 'phase transition' as a function of the vertex mean degree, for different ensembles of sparse random graphs in the large size ('thermodynamic') limit.
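
As a concrete illustration of the objective being maximized, the following minimal Python sketch (my own illustration, not the authors' code) scores a colouring by the number of distinct colours each vertex sees among its neighbours; the toy graph and colour names are invented.

```python
def palette_score(adjacency, colouring):
    """Sum over vertices of the number of distinct colours among their
    neighbours; palette-colouring seeks colourings that maximize this."""
    return sum(len({colouring[u] for u in nbrs})
               for nbrs in adjacency.values())

# Toy example: a 4-cycle coloured with two colours.
adjacency = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
bad = {0: "red", 1: "blue", 2: "red", 3: "blue"}   # each vertex sees 1 colour
good = {0: "red", 1: "blue", 2: "blue", 3: "red"}  # each vertex sees 2 colours
print(palette_score(adjacency, bad), palette_score(adjacency, good))  # 4 8
```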

Relevance: 60.00%

Abstract:

Code division multiple access (CDMA) in which the spreading code assignment to users contains a random element has recently become a cornerstone of CDMA research. The random element in the construction is particularly attractive because it provides robustness and flexibility in utilizing multi-access channels, whilst not making significant sacrifices in terms of transmission power. Random codes are generated from some ensemble; here we consider the possibility of combining two standard paradigms, sparsely and densely spread codes, in a single composite code ensemble. The composite code analysis includes a replica symmetric calculation of performance in the large system limit, and an investigation of finite systems through a composite belief propagation algorithm. A variety of codes are examined, with a focus on the high multi-access interference regime. We demonstrate scenarios, both in the large size limit and for finite systems, in which the composite code's typical performance exceeds that of sparse and dense codes at equivalent signal-to-noise ratio.
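
To make the composite construction concrete, here is a hedged Python sketch of one way to draw codes from a mixed sparse/dense ensemble; the mixing weight `gamma`, the number of sparse chips, and the per-part normalization are illustrative assumptions rather than the paper's exact ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

def composite_codes(n_chips, n_users, sparse_chips=3, gamma=0.5):
    """Each user's spreading sequence mixes a dense part (every chip used)
    with a sparse part (a few randomly chosen chips)."""
    dense = rng.choice([-1.0, 1.0], size=(n_chips, n_users))
    sparse = np.zeros((n_chips, n_users))
    for k in range(n_users):
        rows = rng.choice(n_chips, size=sparse_chips, replace=False)
        sparse[rows, k] = rng.choice([-1.0, 1.0], size=sparse_chips)
    dense /= np.sqrt(n_chips)        # unit power per user for each part
    sparse /= np.sqrt(sparse_chips)
    return np.sqrt(1 - gamma) * dense + np.sqrt(gamma) * sparse

S = composite_codes(n_chips=16, n_users=8)
print(np.round((S**2).sum(axis=0), 3))  # ~unit energy per user, up to cross terms
```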

Relevance: 60.00%

Abstract:

Integer-valued data envelopment analysis (DEA) with alternative returns-to-scale technologies was recently introduced and developed by Kuosmanen and Kazemi Matin. The proportionality assumption of their "natural augmentability" axiom in constant and non-decreasing returns-to-scale technologies makes it possible to achieve feasible decision-making units (DMUs) of arbitrarily large size. In many real-world applications such production plans cannot be achieved, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Several model variants are obtained by introducing a new axiom of "boundedness" over the selected output variables. A mixed-integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set. © 2011 The Authors. International Transactions in Operational Research © 2011 International Federation of Operational Research Societies.
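
As a rough illustration of how such a MILP can be posed, the sketch below evaluates one DMU under constant returns to scale with integer input/output targets and an upper bound on the output. This is a simplified reconstruction with invented data (including the bound `Y_MAX`), not the paper's exact axiomatic model.

```python
import pulp

X = [4, 6, 9, 5]   # one integer input per DMU (invented)
Y = [2, 3, 5, 4]   # one integer output per DMU (invented)
o = 2              # index of the DMU under evaluation
Y_MAX = 6          # illustrative upper bound on the output variable

prob = pulp.LpProblem("integer_dea", pulp.LpMinimize)
theta = pulp.LpVariable("theta", lowBound=0)
lam = [pulp.LpVariable(f"lam_{j}", lowBound=0) for j in range(len(X))]
x_hat = pulp.LpVariable("x_hat", lowBound=0, cat="Integer")
y_hat = pulp.LpVariable("y_hat", lowBound=0, upBound=Y_MAX, cat="Integer")

prob += theta                                               # radial efficiency
prob += pulp.lpSum(l * x for l, x in zip(lam, X)) <= x_hat  # integer input target
prob += x_hat <= theta * X[o]                               # radial contraction
prob += pulp.lpSum(l * y for l, y in zip(lam, Y)) >= y_hat  # integer output target
prob += y_hat >= Y[o]                                       # keep current output

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("efficiency score:", pulp.value(theta))
```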

Relevance: 60.00%

Abstract:

ABC (ATP-binding-cassette) transporters carry out many vital functions and are involved in numerous diseases, but study of the structure and function of these proteins is often hampered by their large size and membrane location. Membrane protein purification usually utilizes detergents to solubilize the protein from the membrane, effectively removing it from its native lipid environment. Subsequently, lipids have to be added back and the detergent removed to reconstitute the protein into a lipid bilayer. In the present study, we apply a new methodology for the extraction and purification of ABC transporters without the use of detergent, using instead a copolymer, SMA (polystyrene-co-maleic acid). SMA inserts into a bilayer and assembles into discrete particles, essentially solubilizing the membrane into small discs of bilayer encircled by polymer, termed SMALPs (SMA lipid particles). We show that this polymer can extract several eukaryotic ABC transporters, P-glycoprotein (ABCB1), MRP1 (multidrug-resistance protein 1; ABCC1), MRP4 (ABCC4), ABCG2 and CFTR (cystic fibrosis transmembrane conductance regulator; ABCC7), from a range of different expression systems. The SMALP-encapsulated ABC transporters can be purified by affinity chromatography and are able to bind ligands comparably with those in native membranes or detergent micelles. A greater degree of purity and enhanced stability are seen compared with detergent solubilization. The present study demonstrates that eukaryotic ABC transporters can be extracted and purified without ever being removed from their lipid bilayer environment, opening up a wide range of possibilities for the future study of their structure and function. © The Authors Journal compilation © 2014 Biochemical Society.

Relevance: 60.00%

Abstract:

Synchronous reluctance motors (SynRMs) are gaining popularity in industrial drives thanks to their permanent-magnet-free construction, competitive performance, and robustness. This paper studies the power losses in a 90-kW converter-fed SynRM drive by a calorimetric method, in comparison with the traditional input-output method. After the converter and the motor had been measured simultaneously in separate chambers, the converter was installed inside the large-size chamber next to the motor, and the total drive-system losses were obtained using a single chamber. The uncertainty of both measurement methods is analyzed and discussed.
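
For context, the input-output method referred to above derives the losses as the difference between electrical input and mechanical output power, which is why its uncertainty grows when the losses are small relative to the powers; a minimal sketch with invented readings for a 90-kW drive:

```python
def input_output_losses(p_in_w, p_out_w):
    """Losses as the difference between input and output power."""
    return p_in_w - p_out_w

p_in, p_out = 95_000.0, 90_000.0            # W, illustrative readings
losses = input_output_losses(p_in, p_out)   # 5 kW

# A 0.5% error on each power reading maps to roughly +-925 W of loss error,
# i.e. about 18.5% of the loss itself -- motivating calorimetric measurement.
err = 0.005 * p_in + 0.005 * p_out
print(losses, err)
```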

Relevance: 60.00%

Abstract:

The re-entrant flow shop scheduling problem (RFSP) is regarded as NP-hard and has attracted the attention of both researchers and industry. Current approaches attempt to minimize the makespan of the RFSP without considering the interdependency between the resource constraints and the re-entrant probability. This paper proposes a multi-level genetic algorithm (GA) that includes the co-related re-entrant possibility and the production mode in a multi-level chromosome encoding. A repair operator is incorporated into the GA to revise infeasible solutions by resolving resource conflicts. With the objective of minimizing the makespan, ANOVA is used to fine-tune the GA's parameter settings. Experiments show that the proposed approach is more effective at finding near-optimal schedules than a simulated annealing algorithm, for both small-size and large-size problems. © 2013 Published by Elsevier Ltd.
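
As a point of reference for the search machinery involved, here is a minimal GA skeleton for permutation flow-shop makespan; the multi-level encoding, re-entrant probabilities, and repair operator of the proposed approach are omitted, and the processing times and operators (order crossover, swap mutation) are illustrative choices, not the authors'.

```python
import random

random.seed(1)
PROC = [[3, 2, 4], [2, 4, 1], [4, 1, 3], [1, 3, 2]]  # proc[job][machine], invented

def makespan(perm):
    done = [0] * len(PROC[0])
    for job in perm:
        for k, t in enumerate(PROC[job]):
            done[k] = max(done[k], done[k - 1] if k else 0) + t
    return done[-1]

def order_crossover(a, b):
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    rest = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

pop = [random.sample(range(4), 4) for _ in range(20)]
for _ in range(50):
    pop.sort(key=makespan)
    elite = pop[:10]
    children = [order_crossover(*random.sample(elite, 2)) for _ in range(10)]
    for c in children:                       # swap mutation
        if random.random() < 0.2:
            i, j = random.sample(range(4), 2)
            c[i], c[j] = c[j], c[i]
    pop = elite + children
print(min(map(makespan, pop)))
```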

Relevance: 60.00%

Abstract:

Metrics estimate the quality of different aspects of software. In particular, cohesion indicates how well the parts of a system hold together. A metric to evaluate class cohesion is important in object-oriented programming because it gives an indication of good class design. Several class cohesion metrics have been proposed, but they suffer from various problems (for instance, low discrimination). In this paper, a new metric to evaluate class cohesion, called SCOM, is proposed; it has several relevant features. It has an intuitive and analytical formulation, which is necessary for applying it to large software systems. It is normalized to produce values in the range [0..1], thus yielding meaningful values. It is also more sensitive than the metrics previously reported in the literature. The attributes and methods used to evaluate SCOM are unambiguously stated. SCOM has an analytical threshold, which is a very useful but rare feature in software metrics. We assess the metric with several sample cases, showing that it gives more sensitive values than other well-known cohesion metrics.
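
The abstract does not reproduce the formula, so the sketch below computes a SCOM-style score from my reading of the published definition (pairwise connection intensity between methods, weighted by the fraction of the class's attributes each pair touches); treat the exact weighting as an assumption to check against the paper.

```python
from itertools import combinations

def scom(methods, n_attributes):
    """methods: list of sets of attribute indices each method uses.
    Returns a cohesion value normalized to [0, 1]."""
    m = len(methods)
    if m < 2 or n_attributes == 0:
        return 0.0
    total = 0.0
    for a, b in combinations(methods, 2):
        if a and b:
            connection = len(a & b) / min(len(a), len(b))
            weight = len(a | b) / n_attributes   # assumed weighting
            total += connection * weight
    return 2.0 * total / (m * (m - 1))

# Toy class with attributes {0, 1, 2} and three methods:
print(scom([{0, 1}, {1, 2}, {0, 2}], 3))  # 0.5
```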

Relevance: 60.00%

Abstract:

This paper considers the problem of concept generalization in decision-making systems, taking into account such features of real-world databases as large size, incompleteness and inconsistency of the stored information. Methods from rough set theory (such as lower and upper approximations, positive regions and reducts) are used to solve this problem. A new discretization algorithm for continuous attributes is proposed. It substantially improves the overall performance of the generalization algorithms and can be applied to the processing of real-valued attributes in large data tables. A search algorithm for significant attributes, combined with the discretization stage, is also developed. It avoids splitting the continuous domains of insignificant attributes into intervals.
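
For readers unfamiliar with the rough-set machinery, this minimal sketch computes the lower and upper approximations of a target set from the indiscernibility classes induced by attribute values; the toy data are invented.

```python
def approximations(objects, target):
    """objects: name -> tuple of attribute values; target: set of names."""
    classes = {}
    for name, values in objects.items():
        classes.setdefault(values, set()).add(name)  # indiscernibility classes
    lower = {x for c in classes.values() if c <= target for x in c}
    upper = {x for c in classes.values() if c & target for x in c}
    return lower, upper

objs = {"a": (1, 0), "b": (1, 0), "c": (0, 1), "d": (0, 0)}
print(approximations(objs, {"a", "c"}))  # lower {'c'}, upper {'a', 'b', 'c'}
```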

Relevance: 60.00%

Abstract:

Analysis of the risk measures associated with price-series movements, and their prediction, is of strategic importance in the financial markets as well as to policy makers, in particular for short- and long-term planning and for setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, being free of large outliers and satisfying the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions and may not possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus the appropriateness of LSE-based regression models may be questioned, and their applicability may be limited. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
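
As a small illustration of both ingredients, the sketch below computes a historical VaR as a tail quantile of synthetic fat-tailed returns and contrasts a least-squares (L2) fit with a least-absolute-deviations (L1) fit; the data, the lag-one regression, and all parameters are invented, not the Iranian crude-oil analysis itself.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=1000) * 0.02   # fat-tailed daily returns

var_95 = -np.quantile(returns, 0.05)               # 95% historical VaR
print(f"95% VaR: {var_95:.4f}")

# L2 vs L1 fit of returns on a lagged predictor:
x, y = returns[:-1], returns[1:]
beta_l2 = np.polyfit(x, y, 1)                      # least squares [slope, intercept]
beta_l1 = minimize(lambda b: np.abs(y - b[0] * x - b[1]).sum(),
                   x0=beta_l2, method="Nelder-Mead").x  # least absolute deviations
print(beta_l2, beta_l1)
```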

Relevance: 60.00%

Abstract:

The span of control is the single most discussed concept in classical and modern management theory. In specifying conditions for organizational effectiveness, the span of control has generally been regarded as a critical factor. Existing research has focused mainly on qualitative methods for analyzing this concept, for example heuristic rules based on experience and/or intuition. This research takes a quantitative approach and formulates the problem as a binary integer model, which is used as a tool to study the organizational design issue. The model considers a range of requirements affecting the management and supervision of a given set of jobs in a company. Its decision variables include the allocation of jobs to workers, considering the complexity and compatibility of each job with respect to workers, and management requirements for planning, execution, training, and control activities in a hierarchical organization. The objective of the model is to minimize operations cost, which is the sum of the supervision costs at each level of the hierarchy and the costs of the workers assigned to jobs. The model is intended for application in the make-to-order industries as a design tool. It could also be applied to make-to-stock companies as an evaluation tool, to assess the optimality of their current organizational structure. Extensive experiments were conducted to validate the model, to study its behavior, and to evaluate the impact of changing parameters on practical problems. This research proposes a meta-heuristic approach to solving large-size problems, based on the concept of greedy algorithms and the Meta-RaPS algorithm. The proposed heuristic was evaluated with two measures of performance: solution quality and computational speed. Quality is assessed by comparing the objective function value obtained to that of the optimal solution. Computational efficiency is assessed by comparing the computer time used by the proposed heuristic to the time taken by a commercial software system. Test results show that the proposed heuristic procedure generates good solutions in a time-efficient manner.
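
To give a flavour of the Meta-RaPS idea (without the model's hierarchy and supervision constraints), here is a hedged sketch of a randomized greedy construction: most of the time the best-priority choice is taken, otherwise a candidate within a restriction band of the best is picked at random. The cost matrix and parameter values are invented.

```python
import random

random.seed(0)
COST = [[4, 2, 7], [3, 5, 1], [6, 4, 2], [5, 3, 6]]  # COST[job][worker], invented
P_BEST, RESTRICTION = 0.7, 0.25

def construct():
    """One randomized greedy pass: assign each job to a worker."""
    total = 0
    for job_costs in COST:
        best = min(job_costs)
        if random.random() < P_BEST:
            total += best                     # take the best-priority choice
        else:                                 # pick among near-best candidates
            candidates = [c for c in job_costs if c <= best * (1 + RESTRICTION)]
            total += random.choice(candidates)
    return total

print(min(construct() for _ in range(200)))   # best of repeated constructions
```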

Relevance: 60.00%

Abstract:

In fire-dependent forests, managers are interested in predicting the consequences of prescribed burning for post-fire tree mortality. We examined the effects of prescribed fire on tree mortality in Florida Keys pine forests, using a factorial design with understory type, season, and year of burn as factors. We also used logistic regression to model the effects of burn season, fire severity, and tree dimensions on individual tree mortality. Despite limited statistical power due to problems in carrying out the full suite of planned experimental burns, associations with tree and fire variables were observed. Post-fire pine tree mortality was negatively correlated with tree size and positively correlated with char height and percent crown scorch. Unlike post-fire mortality, tree mortality associated with storm surge from Hurricane Wilma was greater in the large size classes. Because of their influence on population structure and fuel dynamics, the size-selective mortality patterns following fire and storm surge have practical importance for the future use of fire as a management tool in Florida Keys pinelands, particularly as the threats to these forests' continued existence from tropical storms and sea level rise are expected to increase.
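
The kind of individual-tree mortality model described can be sketched as follows; the synthetic data are placeholders wired to reproduce the reported signs (mortality falling with tree size, rising with char height and crown scorch), not the Florida Keys measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
dbh = rng.uniform(5, 40, n)       # tree diameter, cm (synthetic)
char = rng.uniform(0, 3, n)       # char height, m (synthetic)
scorch = rng.uniform(0, 100, n)   # percent crown scorch (synthetic)

# Generate mortality with assumed signs: larger trees die less often,
# higher char and scorch kill more often.
logit = -1.0 - 0.08 * dbh + 0.8 * char + 0.03 * scorch
died = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([dbh, char, scorch]), died)
print(model.coef_)  # expect negative for size, positive for char and scorch
```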

Relevance: 60.00%

Abstract:

The thesis which follows is a study of recruiting and developing skilled workers for hotel food service operations in the Miami area. The aim of the study is to bring to the attention of personnel management the role of recruiting and training in providing the skilled people needed for their operations, in both the short and the long run. The study was done as a case study of medium- and large-size hotels in the Miami area with a minimum of 250 units each. The findings have been generalized where possible and when the data permitted. The primary data were collected using a questionnaire survey composed of key questions about recruiting, training, sources of skilled people, reasons for turnover, etc. Eight tables were constructed, analyzed and interpreted, with a personal opinion offered in the interpretation of each table's data. It was found that personnel management should provide better recruiting and development procedures in order to attract more qualified people, particularly among young people, who are the potential skilled workers of the future. It was concluded that the quality of work life, the benefits, and the opportunities for advancement in food and beverage operations play a significant role in an employee's decision to stay with a particular job and to acquire the necessary skills.

Relevance: 60.00%

Abstract:

The restructuring of production has caused several changes in the workplace since the 1970s; in Brazil these changes became more significant during the 1990s, with the implementation of neoliberal policies and the country's submission to the determinations of the IMF and the World Bank. In this context, structural unemployment has risen and informality has grown as a practice that mitigates the lack of formal employment. At present, the activity of mototaxi driver has grown in small, medium and large municipalities across the country. In Caicó/RN, as in other municipalities, this activity has emerged as an alternative livelihood in the face of rising unemployment. Considering that this is a precarious and risky activity, we asked: what are the health conditions of mototaxi drivers in the municipality of Caicó in the context of precarious work? What perception do these workers have of the health-disease process and its relation to their work? How is mototaxi drivers' access to the rights to health and social security configured? The research examined the health conditions of mototaxi drivers in the municipality of Caicó/RN in the context of precarious work. Methodologically, the study used documentary research, semi-structured interviews, and a questionnaire with open and closed questions applied to a sample of the city's mototaxi drivers between August and September 2013. The results revealed that these workers are constantly exposed to various risks inherent to the profession and to the space in which they carry out their activity, namely traffic, with traffic accidents and urban violence among the greatest risks identified by the mototaxi drivers in this study.

Relevance: 60.00%

Abstract:

The goal of this research was to determine the composition of boron deposits produced by pyrolysis of boron tribromide, and to use the results to (a) determine the experimental conditions (reaction temperature, etc.) necessary to produce alpha-rhombohedral boron and (b) guide the development/refinement of the pyrolysis experiments such that large, high purity crystals of alpha-rhombohedral boron can be produced with consistency. Developing a method for producing large, high purity alpha-rhombohedral boron crystals is of interest because such crystals could potentially be used to achieve an alpha-rhombohedral boron based neutron detector design (a solid-state detector) that could serve as an alternative to existing neutron detector technologies. The supply of neutron detectors in the United States has been hampered for a number of years due to the current shortage of helium-3 (a gas used in many existing neutron detector technologies); the development of alternative neutron detector technology such as an alpha-rhombohedral boron based detector would help provide a more sustainable supply of neutron detectors in this country. In addition, the prospect/concept of an alpha-rhombohedral boron based neutron detector is attractive because it offers the possibility of achieving a design that is smaller, longer life, less power consuming, and potentially more sensitive than existing neutron detectors. The main difficulty associated with creating an alpha-rhombohedral boron based neutron detector is that producing large, high purity crystals of alpha-rhombohedral boron is extremely challenging. Past researchers have successfully made alpha-rhombohedral boron via a number of methods, but no one has developed a method for consistently producing large, high purity crystals. Alpha-rhombohedral boron is difficult to make because it is only stable at temperatures below around 1100-1200 °C, its formation is very sensitive to impurities, and the conditions necessary for its formation are not fully understood or agreed upon in the literature. In this research, the method of pyrolysis of boron tribromide (hydrogen reduction of boron tribromide) was used to deposit boron on a tantalum filament. The goal was to refine this method, or potentially use it in combination with a second method (amorphous boron crystallization), to the point where it is possible to grow large, high purity alpha-rhombohedral boron crystals with consistency. A pyrolysis apparatus was designed and built, and a number of trials were run to determine the conditions (reaction temperature, etc.) necessary for alpha-rhombohedral boron production. This work was focused on the x-ray diffraction analysis of the boron deposits; x-ray diffraction was performed on a number of samples to determine the types of boron (and other compounds) formed in each trial and to guide the choices of test conditions for subsequent trials. It was found that at low reaction temperatures (in the range of around 830-950 °C), amorphous boron was the primary form of boron produced. Reaction temperatures in the range of around 950-1000 °C yielded various combinations of crystalline boron and amorphous boron. In the first trial performed at a temperature of 950 °C, a mix of amorphous boron and alpha-rhombohedral boron was formed. Using a scanning electron microscope, it was possible to see small alpha-rhombohedral boron crystals (on the order of ~1 micron in size) embedded in the surface of the deposit. 
In subsequent trials carried out at reaction temperatures in the range of 950-1000 °C, it was found that various combinations of alpha-rhombohedral boron, beta-rhombohedral boron, and amorphous boron were produced; the results tended to be unpredictable (alpha-rhombohedral boron was not produced in every trial), and the factors leading to success or failure were difficult to pinpoint. These results illustrate how sensitive the process of producing alpha-rhombohedral boron can be, and indicate that further improvements to the test apparatus and test conditions (for example, higher purity and cleanliness) may be necessary to optimize the boron deposition. Although alpha-rhombohedral boron crystals of large size were not achieved, this research was successful in (a) developing a pyrolysis apparatus and test procedure that can serve as a platform for future testing, (b) determining reaction temperatures at which alpha-rhombohedral boron can form, and (c) developing a consistent process for analyzing the boron deposits and determining their composition. Further experimentation is necessary to achieve a pyrolysis apparatus and test procedure that can yield large alpha-rhombohedral boron crystals with consistency.

Relevance: 60.00%

Abstract:

This article reviews major features of the 'giant' Cape Blanc filament off Mauritania with regard to the transport of chlorophyll and organic carbon from the shelf to the open ocean. Within the filament, chlorophyll is transported about 400 km offshore. Modelled particle distributions along a zonal transect at 21°N showed that particles with a sinking velocity of 5 m d⁻¹ are advected offshore by up to 600 km in subsurface particle clouds generally located between 400 m and 800 m water depth, forming an Intermediate Nepheloid Layer (INL). This corresponds to the depth of the oxygen minimum zone. Heavier particles with a sinking velocity of 30 m d⁻¹ are transported from the shelf within the Bottom Layer (BL), more than 1000 m thick, largely following the topography of the bottom slope. The particles advected within the BL contribute, via long-distance advection in deeper waters, to the enhanced winter-spring mass fluxes collected at the open-ocean mesotrophic sediment trap site CB-13 (200 nautical miles offshore). The lateral contribution to the deep sediment trap in winter-spring is estimated to be 63% for organic carbon and 72% for total mass, whereas the lateral input of both components on an annual basis is estimated to be on the order of 15%. Biogenic opal increases almost fivefold from the upper to the lower mesotrophic CB-13 trap, also pointing to an additional source of biogenic silica from eutrophic coastal waters. Blooms evidently sink in smaller, probably mesoscale patches with variable settling rates, depending on the type of aggregated particles and their ballast content. Generally, particle sinking rates are exceptionally high off NW Africa. The very high chlorophyll values and large size of the Cape Blanc filament in 1998-1999 are also documented in enhanced total mass and organic carbon fluxes. An increasing trend in satellite chlorophyll concentrations and in the size of the Cape Blanc filament between 1997 and 2008, as observed for other coastal upwelling areas, is not documented.
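
A back-of-the-envelope check of the quoted advection scales: a particle sinking at w m d⁻¹ through a depth H while carried by a horizontal current u travels roughly u·(H/w) offshore. The current speed assumed below is an illustrative value, not taken from the study.

```python
SECONDS_PER_DAY = 86_400

def offshore_km(depth_m, sink_m_per_day, current_m_per_s):
    """Horizontal distance travelled while sinking through depth_m."""
    days = depth_m / sink_m_per_day
    return current_m_per_s * SECONDS_PER_DAY * days / 1000.0

# Slow particles (5 m/day) vs fast ones (30 m/day), assumed 0.05 m/s current:
print(offshore_km(600, 5, 0.05))   # ~520 km, the order of the 600 km reported
print(offshore_km(600, 30, 0.05))  # ~86 km, staying much closer to the slope
```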