
The large sample distribution of the estimated moment function is also obtained. These results are used to discuss the situation when the moment conditions hold but the model is misspecified. It is also shown that the overidentifying restrictions test has asymptotic power one whenever the limit moment function is different from zero. It is also proved that the bootstrap distributions converge almost surely to the previously mentioned distributions, and hence they can be used as an alternative to draw inferences under misspecification.

Interestingly, it is also shown that the bootstrap can be reliably applied even if the number of bootstrap replications is very small.

It is well known that outliers or faulty observations affect the analysis of unreplicated factorial experiments. This work proposes a method that combines the rank transformation of the observations, the Daniel plot, and a formal statistical testing procedure to assess the significance of the effects.

It is shown, by means of previous theoretical results cited in the literature, examples, and a Monte Carlo study, that the approach is helpful in the presence of outlying observations. The simulation study includes an ample set of alternative procedures that have been published in the literature to detect significant effects in unreplicated experiments. The Monte Carlo study also gives evidence that the proposed rank transformation provides two advantages: it keeps control of the experimentwise error rate and it improves the relative power to detect active factors in the presence of outlying observations.
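As an aside, the mechanics of the rank-transform analysis are easy to sketch. The toy example below (hypothetical data, not from the paper) ranks the responses of an unreplicated 2^3 factorial and computes factor effects on the ranks, which a Daniel (half-normal) plot would then display; an outlier distorts the raw effects far more than the rank-based ones.

```python
# A minimal sketch (not the authors' code) of the rank-transform idea:
# rank the responses of an unreplicated 2^3 factorial, then estimate
# effects from the ranked data; outliers influence ranks far less.
import numpy as np
from itertools import product
from scipy.stats import rankdata

# Hypothetical responses from a 2^3 unreplicated factorial (one is faulty).
y = np.array([45.0, 71.0, 48.0, 65.0, 68.0, 60.0, 80.0, 965.0])

# Design matrix of +/-1 factor levels; columns are factors A, B, C.
X = np.array(list(product([-1, 1], repeat=3)))[:, ::-1]

def effects(resp, X):
    # Effect of each factor = mean at high level minus mean at low level.
    return np.array([resp[X[:, j] == 1].mean() - resp[X[:, j] == -1].mean()
                     for j in range(X.shape[1])])

raw_effects = effects(y, X)
rank_effects = effects(rankdata(y), X)   # effects computed on the ranks

print("raw effects :", raw_effects)      # dominated by the outlier
print("rank effects:", rank_effects)     # a Daniel plot of these is more reliable
```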

Most of the inferential results are based on the assumption that the user has a "random" sample, by which it is usually understood that the observations are a realization of a set of independent, identically distributed random variables. However, most of the time this is not true, mainly for two reasons: first, the data are not obtained by means of a probabilistic sampling scheme from the population; they are just gathered as they become available or, in the best of cases, using some kind of control variables and quota sampling. For an excellent discussion of the kind of considerations that should be made in the first situation, see Hahn and Meeker, and a related comment in Aguirre. For the second problem there is a book on the topic by Skinner et al.

In this paper we consider the problem of evaluating the effect of sampling complexity on Pearson's chi-square and other alternative tests for goodness of fit for proportions. Out of this work come several adjustments to Pearson's test, namely: Wald-type tests, an average eigenvalue correction, and a Satterthwaite-type correction. There is a more recent and general resampling approach given in Sitter, but it was not pursued in this study.
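To make the average eigenvalue idea concrete, here is a minimal sketch of a Rao-Scott-style first-order adjustment; the counts and design effects are hypothetical.

```python
# A minimal sketch, assuming a Rao-Scott-style first-order adjustment:
# Pearson's X^2 is divided by the average design effect (average eigenvalue
# of the design-effects matrix) before referring it to a chi-square table.
import numpy as np
from scipy.stats import chi2

observed = np.array([230, 180, 90])        # hypothetical weighted counts
p0 = np.array([0.45, 0.35, 0.20])          # proportions under H0
n = observed.sum()
expected = n * p0

x2 = ((observed - expected) ** 2 / expected).sum()   # Pearson statistic

# Hypothetical estimated design effects for each cell proportion,
# e.g. from a clustered sampling design.
deff = np.array([1.8, 2.1, 1.5])
x2_adj = x2 / deff.mean()                  # average-eigenvalue correction

df = len(p0) - 1
print("naive p-value   :", chi2.sf(x2, df))
print("adjusted p-value:", chi2.sf(x2_adj, df))
```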

Sometimes data analysis using the usual parametric techniques produces misleading results due to violations of the underlying assumptions, such as outliers or non-constant variances. In particular, this can happen in unreplicated factorial or fractional factorial experiments. To help in this situation, alternative analyses have been proposed. For example, Box and Meyer give a Bayesian analysis allowing for possibly faulty observations in unreplicated factorials, and the well-known Box-Cox transformation can be used when there is a change in dispersion.

This paper presents an analysis based on the rank transformation that deals with the above problems. The analysis is simple to use and can be implemented with a general purpose statistical computer package. The procedure is illustrated with examples from the literature. A theoretical justification is outlined at the end of the paper.

The article considers the problem of choosing between two possibly nonlinear models that have been fitted to the same data using M-estimation methods. An asymptotically normally distributed test statistic is proposed, and its properties are examined using a Monte Carlo study.

We found that the presence of a competitive model in either the null or the alternative hypothesis affects the distributional properties of the tests, and that when the data contain outlying observations the new procedure had significantly higher power than the rest of the tests.

Fuller, Anderson, and Hannan introduce infinite moving average models as the limit in quadratic mean of a sequence of partial sums, and Fuller shows that if the addends are assumed independent then the limit also holds almost surely. This note shows that without the assumption of independence, the limit holds with probability one. Moreover, the proofs given here are easier to teach.

A test for the problem of choosing between several nonnested nonlinear regression models simultaneously is presented. The test does not require an explicit specification of a parametric family of distributions for the error term and has a closed form.

The asymptotic distribution of the generalized Cox test for choosing between two multivariate, nonlinear regression models in implicit form is derived. The data are assumed to be generated by a model that need not be either the null or the non-null model. Some investigations of these characteristics are included. The idea is to replace an analytical computation of the expectation of the Cox difference with a bootstrap estimate.
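The bootstrap replacement of the analytic expectation can be sketched generically. The toy models and the residual-variance stand-in for the Cox difference below are illustrative assumptions, not the paper's actual statistic.

```python
# A minimal sketch (toy models, not the paper's) of replacing the analytic
# expectation of the Cox difference with a parametric bootstrap estimate.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
f0 = lambda x, a, b: a * np.exp(b * x)      # null model H0
f1 = lambda x, a, b: a * x ** b             # alternative model H1

x = np.linspace(0.5, 3.0, 60)
y = f0(x, 1.2, 0.8) + rng.normal(0, 0.3, x.size)   # hypothetical data

def cox_diff(x, y):
    # Difference of fitted residual variances, a stand-in for the
    # log-likelihood-ratio term in the Cox statistic.
    s0 = ((y - f0(x, *curve_fit(f0, x, y, p0=[1, 1])[0])) ** 2).mean()
    s1 = ((y - f1(x, *curve_fit(f1, x, y, p0=[1, 1])[0])) ** 2).mean()
    return np.log(s1 / s0)

t_obs = cox_diff(x, y)
theta0 = curve_fit(f0, x, y, p0=[1, 1])[0]
sigma0 = ((y - f0(x, *theta0)) ** 2).mean() ** 0.5

# Bootstrap estimate of the expectation of the difference under the
# fitted null model, used to center the observed statistic.
B = 200
boot = [cox_diff(x, f0(x, *theta0) + rng.normal(0, sigma0, x.size))
        for _ in range(B)]
print("centered Cox statistic:", t_obs - np.mean(boot))
```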

In many Solvency and Basel loss data sets, there are thresholds or deductibles that affect the analysis. On the other hand, the Birnbaum-Saunders model has received great attention during the last two decades, and it can be used as a loss distribution. In this paper, we propose a solution to the problem of deductibles using a truncated version of the Birnbaum-Saunders distribution.

The probability density function, cumulative distribution function, and moments of this distribution are obtained. In addition, properties regularly used in the insurance industry, such as multiplication by a constant (the inflation effect) and the reciprocal transformation, are discussed. Furthermore, a study of the behavior of the risk rate and of risk measures is carried out. Moreover, estimation aspects are also considered in this work. Finally, an application based on real loss data from a commercial bank is conducted.
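A left-truncated Birnbaum-Saunders density is straightforward to sketch numerically; the snippet below assumes scipy's fatiguelife parameterization of the Birnbaum-Saunders distribution and hypothetical parameter values.

```python
# A minimal sketch, assuming scipy's fatiguelife distribution as the
# Birnbaum-Saunders model, truncated from below at a deductible d.
import numpy as np
from scipy.stats import fatiguelife
from scipy.integrate import quad

alpha, beta = 0.8, 2.0           # hypothetical shape and scale
d = 1.0                          # deductible (left-truncation point)
bs = fatiguelife(alpha, scale=beta)
tail = bs.sf(d)                  # P(X > d)

def pdf_trunc(x):
    # Density of X | X > d: f(x) / (1 - F(d)) on x > d.
    return np.where(x > d, bs.pdf(x) / tail, 0.0)

def cdf_trunc(x):
    return np.clip((bs.cdf(x) - bs.cdf(d)) / tail, 0.0, 1.0)

# First moment of the truncated loss, by numerical integration.
mean_trunc, _ = quad(lambda x: x * pdf_trunc(x), d, np.inf)
print("E[X | X > d] =", mean_trunc)
```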

This paper proposes two new estimators for determining the number of factors r in static approximate factor models. We exploit the well-known fact that the r largest eigenvalues of the variance matrix of N response variables grow unboundedly as N increases, while the other eigenvalues remain bounded. The new estimators are obtained simply by maximizing the ratio of two adjacent eigenvalues. Our simulation results provide promising evidence for the two estimators.

We study a modification of the Luce rule for stochastic choice which admits the possibility of zero probabilities.

In any given menu, the decision maker uses the Luce rule on a consideration set, potentially a strict subset of the menu. Without imposing any structure on how the consideration sets are formed, we characterize the resulting behavior using a single axiom. Our result offers insight into special cases where consideration sets are formed under various restrictions.
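Returning to the factor-model abstract above, the eigenvalue-ratio estimator is simple enough to sketch on simulated data (a minimal illustration, with all values hypothetical):

```python
# A minimal sketch of the eigenvalue-ratio idea from the factor-model
# abstract above: r is estimated where the ratio of adjacent eigenvalues
# of the sample covariance matrix of the N series is maximized.
import numpy as np

rng = np.random.default_rng(1)
N, T, r_true = 50, 200, 3
F = rng.normal(size=(T, r_true))                  # latent factors
L = rng.normal(size=(N, r_true))                  # loadings
X = F @ L.T + rng.normal(scale=0.5, size=(T, N))  # panel of N series

eig = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]  # descending
kmax = 10
ratios = eig[:kmax] / eig[1:kmax + 1]
r_hat = int(np.argmax(ratios)) + 1                # +1: ratios[0] is k=1
print("estimated number of factors:", r_hat)      # typically 3 here
```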

Purpose— This paper summarizes the findings of a research project aimed at benchmarking the environmental sustainability practices of the top Mexican companies. The survey also explored how the adoption of environmental sustainability practices relates to the competitiveness of these firms. Findings— The results suggest that Mexican companies are very active in the various areas of business where environmental sustainability is relevant. Because the manufacturing sector is significantly overrepresented in the sample and because of its importance in addressing issues of environmental sustainability, when appropriate, specific results for this sector are reported and contrasted to the overall sample.

Practical implications— The vast majority of these firms see adopting environmental sustainability practices as being profitable and think this will be even more important in the future. In Mexico, one might expect that the same would be true, but only anecdotal evidence was heretofore available.

We derive optimal consumption and portfolio policies that are robust to uncertainty about the hard-to-estimate drift rate, jump intensity, and jump size parameters. We also provide a semi-closed form formula for the detection-error probability and compare various portfolio holding strategies, including robust and non-robust policies.

Our quantitative analysis shows that ignoring uncertainty leads to significant wealth loss for the investor.

We exploit the manifold increase in homicides in Mexico resulting from its war on organized drug traffickers to estimate the effect of drug-related homicides on housing prices. We use an unusually rich data set that provides national coverage of housing prices and homicides and exploits within-municipality variation. We find that the impact of violence on housing prices is borne entirely by the poor sectors of the population.

An increase in homicides equivalent to 1 standard deviation leads to a 3 percent decrease in the price of low-income housing.

This paper examines foreign direct investment (FDI) in the Hungarian economy in the period of post-Communist transition. Hungary took a quite aggressive approach in welcoming foreign investment during this period and as a result had the highest per capita FDI in the region. We discuss the impact of FDI in terms of strategic intent, i.e., resource seeking versus market serving.



The effect of these two kinds of FDI is contrasted by examining the impact of resource seeking FDI in manufacturing sectors and market serving FDI in service industries. In the case of transition economies, we argue that due to the strategic intent, resource seeking FDI can imply a short-term impact on economic development whereas market serving FDI strategically implies a long-term presence with increased benefits for the economic development of a transition economy. Our focus is that of market serving FDI in the Hungarian banking sector, which has brought improved service and products to multinational and Hungarian firms.

This has been accompanied by the introduction of innovative financial products to the Hungarian consumer, in particular consumer credit including mortgage financing. However, the latter remains an underserved segment with much growth potential.


For public policy in Hungary and other transition economies, we conclude that policymakers should consider the strategic intent of FDI in order to maximize its benefits in their economies.

We propose a general framework for extracting rotation invariant features from images for the tasks of image analysis and classification. Our framework is inspired by the form of the Zernike set of orthogonal functions. It provides a way to use a set of one-dimensional functions to form an orthogonal set over the unit disk by non-linearly scaling its domain and then associating to it an exponential term.

When the images are projected into the subspace created with the proposed framework, rotations of the image affect only the exponential term, while the values of the orthogonal functions serve as rotation invariant features. We exemplify our framework using the Haar wavelet functions to extract features from several thousand images of symbols. We then use the features in an OCR experiment to demonstrate the robustness of the method.

In this paper we explore the use of orthogonal functions as generators of representative, compact descriptors of image content.

In Image Analysis and Pattern Recognition such descriptors are referred to as image features, and there are some useful properties they should possess such as rotation invariance and the capacity to identify different instances of one class of images. We exemplify our algorithmic methodology using the family of Daubechies wavelets, since they form an orthogonal function set. We benchmark the quality of the image features generated by doing a comparative OCR experiment with three different sets of image features.

Our algorithm can use a wide variety of orthogonal functions to generate rotation invariant features, thus providing the flexibility to identify sets of image features that are best suited for the recognition of different classes of images.

When analyzing catastrophic risk, traditional measures for evaluating risk, such as the probable maximum loss (PML), value at risk (VaR), tail-VaR, and others, can become practically impossible to obtain analytically in certain types of insurance, such as earthquake, and in certain types of reinsurance arrangements, especially non-proportional arrangements with reinstatements.

Given the available information, it can be very difficult for an insurer to measure its risk exposure. This effect can be assessed mathematically. The PML is defined in terms of a very extreme quantile. The resulting reinsurance structures can then be very complicated to analyze, and evaluating their mitigation or transfer effects analytically may be infeasible, so it may be necessary to use alternative approaches, such as Monte Carlo simulation methods.

This is what we do in this paper in order to measure the effect of a complex reinsurance treaty on the risk profile of an insurance company. We compute the pure risk premium and the PML, as well as a host of other results: the impact on the insured portfolio, the risk transfer effect of reinsurance programs, the proportion of times reinsurance is exhausted, the percentage of years it was necessary to use the contractual reinstatements, etc. Since estimators of quantiles are known to be biased, we explore the alternative of using an Extreme Value approach to complement the analysis.

The need to estimate future claims has led to the development of many loss reserving techniques.

There are two important problems that must be dealt with in the process of estimating reserves for outstanding claims: one is to determine an appropriate model for the claims process, and the other is to assess the degree of correlation among claim payments in different calendar and origin years. We approach both problems here. On the one hand we use a gamma distribution to model the claims process and, in addition, we allow the claims to be correlated.

We follow a Bayesian approach for making inference, with vague prior distributions. The methodology is illustrated with a real data set and compared with other standard methods.

Consider a random sample X1, X2, ..., Xn. Only the sample size, mean, and range are recorded, and it is necessary to estimate the unknown population mean and standard deviation. In this paper the estimation of the mean and standard deviation is made from a Bayesian perspective by using a Markov chain Monte Carlo (MCMC) algorithm to simulate samples from the intractable joint posterior distribution of the mean and standard deviation.

The proposed methodology is applied to simulated and real data.

This paper is concerned with the situation that occurs in claims reserving when there are negative values in the development triangle of incremental claim amounts. Typically these negative values will be the result of salvage recoveries, payments from third parties, total or partial cancellation of outstanding claims due to initial overestimation of the loss or to a jury decision favorable to the insurer, rejection by the insurer, or just plain errors.



Some of the traditional methods of claims reserving, such as the chain-ladder technique, may produce estimates of the reserves even when there are negative values. Historically the chain-ladder method has been used as a gold standard benchmark because of its widespread use and ease of application. This paper presents a Bayesian model to consider negative incremental values, based on a three-parameter log-normal distribution. The model presented here allows the actuary to provide point estimates and measures of dispersion, as well as the complete distribution for outstanding claims, from which the reserves can be derived.

It is concluded that the method has a clear advantage over other existing methods.

The Bayesian method of moments (BMOM) is particularly useful for obtaining post-data moments and densities for parameters and future observations when the form of the likelihood function is unknown and thus a traditional Bayesian approach cannot be used.

Also, even when the form of the likelihood is assumed known, in time series problems it is sometimes difficult to formulate an appropriate prior density. Here, we show how the BMOM approach can be used in two nontraditional problems. The first one is conditional forecasting in regression and time series autoregressive models. Specifically, it is shown that when forecasting disaggregated data (say, quarterly data) given aggregate constraints (say, in terms of annual data), it is possible to apply a Bayesian approach to derive conditional forecasts in the multiple regression model.

The types of constraints (conditioning) usually considered are that the sum, or the average, of the forecasts equals a given value. This kind of condition can be applied to forecasting quarterly values whose sum must equal a given annual value. Analogous results are obtained for AR(p) models. The second problem we analyse is the issue of aggregation and disaggregation of data in relation to predictive precision and modelling. Predictive densities are derived for future aggregate values by means of the BMOM, based on a model for disaggregated data.

They are then compared with those derived based on aggregated data.

In this paper the problem is analyzed in the context of cluster sampling. A point estimator and an estimator of the variance of the total are presented. The problem of estimating the accumulated value of a positive and continuous variable for which some partially accumulated data have been observed, usually with only a small number of observations (two years), can be approached by taking advantage of the existence of stable seasonality from one period to another.

For example, the quantity to be predicted may be the total for a period (a year), and the prediction needs to be made as soon as partial information becomes available for given subperiods (months). These conditions appear in a natural way in the prediction of seasonal sales of style goods, such as toys; in the behavior of inventories of goods whose demand varies seasonally, such as fuels; or in banking deposits, among many other examples. In this paper, the problem is addressed within a cluster sampling framework.

A ratio estimator is proposed for the total value to be forecasted under the assumption of stable seasonality. Estimators are obtained for both the point forecast and the variance. The procedure works well when standard methods cannot be applied due to the reduced number of observations. Some real examples are included as well as applications to some previously published data.

Comparisons are made with other procedures.

We present a Bayesian solution to forecasting a time series when few observations are available. The quantity to predict is the accumulated value of a positive, continuous variable when partially accumulated data are observed. These conditions appear naturally in predicting sales of style goods and coupon redemption.

A simple model describes the relation between partial and total values, assuming stable seasonality. Exact analytic results are obtained for point forecasts and the posterior predictive distribution.

Noninformative priors allow automatic implementation. Examples are provided.

We give a brief description of the Project and the characteristics of the target population. We then describe and use the FGT index to determine whether the communities included in the Project were correctly chosen. We describe the method of cost-effectiveness analysis used in this article. The procedure for specifying cost-effectiveness ratios is next presented, and their application to measure the impact of PNAS on food expenditures is carried out.

Finally, we present empirical results showing, among other things, that PNAS increased the food expenditures of the participating households by about 7 percent.

The evidence is mostly qualitative, however, since there are no methods for measuring this participation quantitatively. In this paper we present a procedure for generating an aggregate index of community participation based on productivity.

It is specifically aimed at measuring community participation in the construction of works for collective benefit. Because there are limitations on the information available, additional assumptions must be made to estimate parameters. The method is applied to data from communities in Mexico participating in a national nutrition, food and health program.

A Bayesian approach is used to derive constrained and unconstrained forecasts in an autoregressive time series model. Both are obtained by formulating an AR(p) model in such a way that it is possible to compute numerically the predictive distribution for any number of forecasts.

The types of constraints considered are that a linear combination of the forecasts equals a given value. This kind of restriction is applied to forecasting quarterly values whose sum must equal a given annual value. Constrained forecasts are generated by conditioning on the predictive distribution of unconstrained forecasts.

The problem of temporal disaggregation of time series is analyzed by means of Bayesian methods. The disaggregated values are obtained through a posterior distribution derived by using a diffuse prior on the parameters.

Further analysis is carried out assuming alternative conjugate priors. The means of the different posterior distributions are shown to be equivalent to some sampling theory results. Bayesian prediction intervals are obtained. Forecasts for future disaggregated values are derived assuming a conjugate prior for the future aggregated value.

A formulation of the problem of detecting outliers as an empirical Bayes problem is studied. In so doing we encounter a non-standard empirical Bayes problem for which the notion of average risk asymptotic optimality (a.o.) is introduced.

Some general theorems giving sufficient conditions for a.o. are presented. These general results are then used in various formulations of the outlier problem for underlying normal distributions to give a.o. procedures. Rates of convergence results are also given using the methods of Johns and Van Ryzin.

This article examines the distinctive characteristics and features of how women and men speak. Based on this analysis, the author makes an assessment and then invites the reader to become aware of his or her own manner of speaking.

In the present paper we study Brown spaces, which are connected and not completely Hausdorff. We also show that some elements of BG are Brown spaces, while others are totally separated.

We write some consequences of this result. For example, the space (N, TG) is not connected "im kleinen" at each of its points. This generalizes a result proved by Kirch. We also present a simpler proof of a result given by Szczuka.

In recent years, interest has increased in the development of new materials, in this case composites, as these more advanced materials can perform their work better than conventional materials.

In the present work we analyze the effect of the addition of carbon nanotubes incorporating silver nanoparticles on both the electrical and the mechanical properties. The obtained alloys were characterized by scanning electron microscopy (SEM) and X-ray diffraction analysis; hardness tests were performed and electrical conductivity tests were carried out.

The salts were placed in the inlet to promote corrosion and increase the chemical reaction.

These salts were applied to the alloys via discontinuous exposures. The corrosion products were characterized using thermogravimetric analysis, scanning electron microscopy, and X-ray diffraction. The presence of Mo, Al, and Si was not significant and there was no evidence of chemical reaction of these elements. The most active elements were Fe and Cr in the base metal.

The steel with the best performance was alloy Fe9Cr3AlSi3Mo, due to the effect of the protective oxides, even in the presence of the aggressive salts.

Clustering is an unsupervised process to determine which unlabeled objects in a set share interesting properties. The objects are grouped into k subsets (clusters) whose elements optimize a proximity measure.

Methods based on information theory have proven to be feasible alternatives. They attempt to minimize the entropy of each cluster. We propose a clustering method based on the maximum entropy principle. Such a method explores the space of all possible probability distributions of the data to find one that maximizes the entropy subject to extra conditions based on prior information about the clusters.


As a consequence of such a principle, those distributions of high entropy that satisfy the conditions are favored over others. Searching the space to find the optimal distribution of objects in the clusters represents a hard combinatorial problem, which disallows the use of traditional optimization techniques. Genetic algorithms are a good alternative to solve this problem.

We benchmark our method relative to the best theoretical performance, which is given by the Bayes classifier when data are normally distributed, and to a multilayer perceptron network, which offers the best practical performance when data are not normal. In general, a supervised classification method will outperform a non-supervised one, since, in the first case, the elements of the classes are known a priori. That our unsupervised method nevertheless approaches these benchmarks clearly exhibits its strength.

One of the basic endeavors in Pattern Recognition, and particularly in Data Mining, is the process of determining which unlabeled objects in a set share interesting properties.

This implies a singular process of classification, usually denoted "clustering", where the objects are grouped into k subsets (clusters) in accordance with an appropriate measure of likelihood. Clustering can be considered the most important unsupervised learning problem. The more traditional clustering methods are based on the minimization of a similarity criterion based on a metric or distance.

This fact imposes important constraints on the geometry of the clusters found. Since each element in a cluster lies within a radial distance relative to a given center, the shape of the covering or hull of a cluster is hyper-spherical (convex), which sometimes does not adequately encompass the elements that belong to it. For this reason we propose to solve the clustering problem through the optimization of Shannon's entropy. The optimization of this criterion represents a hard combinatorial problem which disallows the use of traditional optimization techniques, and thus the use of a very efficient optimization technique is necessary.

We consider that Genetic Algorithms are a good alternative. We show that our method obtains successful results for problems where the clusters have complex spatial arrangements. The method obtains clusters with non-convex hulls that adequately encompass their elements. We statistically show that our method displays the best performance that can be achieved under the assumption of normal distribution of the elements of the clusters. We also show that it is a good alternative when this assumption is not met.

We present a novel approach to disentangle the effects of ideology, partisanship, and constituency pressures on roll-call voting.

First, we place voters and legislators on a common ideological space. Finally, we use a structural equation model to account for these separate effects on legislative voting. We rely on public opinion data and a survey of Argentine legislators.

Legislators in presidential countries use a variety of mechanisms to advance their electoral careers and connect with relevant constituents. The most frequently studied activities are bill initiation, co-sponsoring, and legislative speeches. In this paper, the authors examine legislators' information requests (i.e., parliamentary questions).

The authors focus on the case of Chile - where strong and cohesive national parties coexist with electoral incentives that emphasise the personal vote - to examine the links between party responsiveness and legislators' efforts to connect with their electoral constituencies. Making use of a new database of parliamentary questions and a comprehensive sample of geographical references, the authors examine how legislators use this mechanism to forge connections with voters, and find that targeted activities tend to increase as a function of electoral insecurity and progressive ambition.

Recent efforts to theorize the role of emotions in political life have stressed the importance of sympathy, and have often recurred to Adam Smith to articulate their claims. In the early twentieth century, Max Scheler disputed the salutary character of sympathy, dismissing it as an ultimately perverse foundation for human association.

Unlike later critics of sympathy as a political principle, Scheler rejected it for being ill equipped to salvage what, in his opinion, should be the proper basis of morality, namely, moral value. Even if Scheler's objections against Smith's project prove to be ultimately mistaken, he had important reasons to call into question its moral purchase in his own time. Where the most dangerous idol is not self-love but illusory self-knowledge, the virtue of self-command will not suffice.

We present an algorithm that automatically segments and classifies the brain structures in a set of magnetic resonance (MR) brain images using expert information contained in a small subset of the image set.

The algorithm is intended to perform the segmentation and classification tasks mimicking the way a human expert would reason. The algorithm uses a knowledge base taken from a small subset of semiautomatically classified images, combined with a set of fuzzy indexes that capture the experience and expectations a human expert uses during recognition tasks. The fuzzy indexes are tissue-specific and spatially specific, in order to consider the biological variations in the tissues and the acquisition inhomogeneities through the image set.

The brain structures are segmented and classified one at a time. For each brain structure the algorithm needs one semiautomatically classified image and makes one pass through the image set. The algorithm uses low-level image processing techniques on a pixel basis for the segmentations, then validates or corrects the segmentations, and makes the final classification decision using higher level criteria measured by the set of fuzzy indexes.

We use single-echo MR images because of their high volumetric resolution; but even though we are working with only one image per brain slice, we have multiple sources of information on each pixel: absolute and relative positions in the image, gray level value, statistics of the pixel and its three-dimensional neighborhood, and its relation to counterpart pixels in adjacent images. We have validated our algorithm for ease of use and precision, both with clinical experts and with measurable error indexes over a BrainWeb simulated MR set.

We present an attractive methodology for the compression of facial gestures that can be used to drive interaction in real time applications.

Using the eigenface method, we build compact representation spaces for a variety of facial gestures. These compact spaces are the so-called eigenspaces. We do real time tracking and segmentation of facial features from video images and then use the eigenspaces to find compact descriptors of the segmented features. We use the system for an avatar videoconference application where we achieve real time interactivity with very limited bandwidth requirements.

The system can also be used as a hands-free man-machine interface.

We use interactive virtual environments for cognitive behavioral therapy. Working together with children's therapists and psychologists, our computer graphics group developed five interactive simulators for the treatment of fears and behavior disorders.

The simulators run in real time on P4 PCs with graphics accelerators, but also work online using streaming techniques and Web VR engines. The construction of the simulators starts with ideas and situations proposed by the psychologists; these ideas are then developed by graphic designers and finally implemented in 3D virtual worlds by our group.

Our methodology starts with a graphic modeler to build the geometry of the virtual worlds; the models are then exported to a dedicated OpenGL VR engine that can interface with any VR peripheral. Alternatively, the models can be exported to a Web VR engine. The simulators are cost-efficient since they require little more than the PC and the graphics card. We have found that both the therapists and the children that use the simulators find this technology very attractive.

We consider the curved 4-body problems on spheres and hyperbolic spheres. After obtaining a criterion for the existence of quadrilateral configurations on the equator of the sphere, we study two restricted 4-body problems, one in which two masses are negligible and another in which only one mass is negligible. In the former, we prove the existence of square-like relative equilibria, whereas in the latter we discuss the existence of kite-shaped relative equilibria. In both cases, 2D and 3D, we verify the existence of bifurcation; that is, for the same set of masses we determine two new central configurations.


The computation of the bifurcations, as well as their pictures, has been performed considering homogeneous force laws with exponent a.

Medical image segmentation is one of the most productive research areas in medical image processing. The goal of most new image segmentation algorithms is to achieve higher segmentation accuracy than existing algorithms. But the issue of quantitative, reproducible validation of segmentation results remains open, beginning with the question: what is segmentation accuracy? The creation of a validation framework is relevant and necessary for consistent and realistic comparisons of existing, new, and future segmentation algorithms.

An important component of a reproducible and quantitative validation framework for segmentation algorithms is a composite index that will measure segmentation performance at a variety of levels. In this paper we present a prototype composite index that includes the measurement of seven metrics on segmented image sets. We explain how the composite index is a more complete and robust representation of algorithmic performance than currently used indices that rate segmentation results using a single metric. Our proposed index can be read as an averaged global metric or as a series of algorithmic ratings that will allow the user to compare how an algorithm performs under many categories.
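A composite index of this kind can be sketched in a few lines; the snippet below uses Dice, Jaccard, and sensitivity as hypothetical stand-ins for the paper's seven metrics and averages them into a single rating.

```python
# A minimal sketch of a composite index: several per-image metrics are
# computed on binary segmentation masks and averaged into one rating.
# The seven metrics of the paper are not reproduced; Dice, Jaccard and
# sensitivity stand in as hypothetical components.
import numpy as np

def metrics(seg, truth):
    tp = np.logical_and(seg, truth).sum()
    fp = np.logical_and(seg, ~truth).sum()
    fn = np.logical_and(~seg, truth).sum()
    dice = 2 * tp / (2 * tp + fp + fn)
    jaccard = tp / (tp + fp + fn)
    sensitivity = tp / (tp + fn)
    return np.array([dice, jaccard, sensitivity])

rng = np.random.default_rng(2)
truth = rng.random((64, 64)) > 0.5              # hypothetical ground truth
seg = truth.copy()
seg[rng.random((64, 64)) < 0.05] ^= True        # segmentation with 5% noise

per_metric = metrics(seg, truth)
composite = per_metric.mean()                   # averaged global rating
print("per-metric ratings:", per_metric, "composite:", composite)
```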

We calibrate the cost of sovereign defaults using a continuous time model, where government default decisions may trigger a change in the regime of a stochastic TFP process. We calibrate the model to a sample of European countries. By comparing the estimated drift in default relative to that in no-default, we find that TFP falls in the range of 3. The model is consistent with observed falls in GDP growth rates and subsequent recoveries, and illustrates why fiscal multipliers are small during sovereign debt crises.

Employment to population ratios differ markedly across Organization for Economic Cooperation and Development (OECD) countries, especially for people aged over 55 years. In addition, social security features differ markedly across the OECD, particularly with respect to features such as generosity, entitlement ages, and implicit taxes on social security benefits. This study postulates that differences in social security features explain many of the differences in employment to population ratios at older ages. This conjecture is assessed quantitatively with a life cycle general equilibrium model of retirement.

At ages years, the correlation between the simulations of this study's model and observed data is 0. Generosity and implicit taxes are key features to explain the cross-country variation, whereas entitlement age is not.

The consequences of increases in the scale of tax and transfer programs are assessed in the context of a model with idiosyncratic productivity shocks and incomplete markets. The effects are contrasted with those obtained in a stand-in household model featuring no idiosyncratic shocks and complete markets. The main finding is that the impact on hours remains very large, but the welfare consequences are very different.

The analysis also suggests that tax and transfer policies have large effects on average labor productivity via selection effects on employment.

The paper starts with a conceptual distinction between rigidity and supremacy. Subsequently, those categories are used to analyze the Mexican system and to question the capacity of the amendment process to guarantee constitutional supremacy. Finally, the paper makes some proposals to amend the Mexican constitutional amendment process in order to make it more democratic, deliberative, and effective in guaranteeing constitutional supremacy.

Popular constitutionalism is a contemporary constitutional theory with a critical view of the U.S. constitutional narrative's focus on judicial supremacy. Instead, popular constitutionalism regards the people as the main actor. It defends an anti-elitist understanding of constitutional law. From the institutional perspective, popular constitutionalism proposes a weak model of constitutionalism and a strong participatory democracy.

Likewise, some challenges and concerns raised by the ruling are considered, in light of the judicial activism it may entail.

From now on, the Supreme Court is able to judge the stigmatizing messages of law. Finally, I express some thoughts about the issues that this judgment could pose to the Supreme Court.

Business cycles in emerging economies display very volatile consumption and a strongly countercyclical trade balance. We show that aggregate consumption in these economies is not more volatile than output once durables are accounted for.

Then, we present and estimate a real business cycle model for a small open economy that accounts for this empirical observation. Our results show that the role of permanent shocks to aggregate productivity in explaining cyclical fluctuations in emerging economies is considerably lower than previously documented. Moreover, we find that financial frictions are crucial to explain some key business cycle properties of these economies.

We use intraday data to compute weekly realized variance, skewness, and kurtosis for equity returns and study the realized moments' time-series and cross-sectional properties.

We investigate if this week's realized moments are informative for the cross-section of next week's stock returns. We find a very strong negative relationship between realized skewness and next week's stock returns. A trading strategy that buys stocks in the lowest realized skewness decile and sells stocks in the highest realized skewness decile generates an average weekly return of 19 basis points with a t-statistic of 3.

Our results on realized skewness are robust across a wide variety of implementations, sample periods, portfolio weightings, and firm characteristics, and are not captured by the Fama-French and Carhart factors. We find some evidence that the relationship between realized kurtosis and next week's stock returns is positive, but the evidence is not always robust and statistically significant. We do not find a strong relationship between realized volatility and next week's stock returns.
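For concreteness, a minimal sketch of weekly realized moments, assuming the standard definitions used in this literature (the data are simulated):

```python
# A minimal sketch, assuming the standard realized-moment formulas:
# from N intraday returns r_i in a week,
#   RVar  = sum r_i^2,
#   RSkew = sqrt(N) * sum r_i^3 / RVar^(3/2),
#   RKurt = N * sum r_i^4 / RVar^2.
import numpy as np

rng = np.random.default_rng(3)
r = rng.standard_t(df=5, size=5 * 78) * 1e-3   # hypothetical 5-min returns, 1 week
N = r.size

rvar = np.sum(r ** 2)
rskew = np.sqrt(N) * np.sum(r ** 3) / rvar ** 1.5
rkurt = N * np.sum(r ** 4) / rvar ** 2

print(f"realized var {rvar:.2e}, skew {rskew:.2f}, kurt {rkurt:.2f}")
# Sorting stocks weekly by rskew and buying the lowest decile while
# selling the highest mirrors the strategy described above.
```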

This paper empirically investigates the determinants of Internet and cellular phone penetration levels in a cross-country setting. It offers a framework to explain differences in the use of information and communication technologies in terms of differences in the institutional environment and the resulting investment climate. Mobile phone networks, on the other hand, are built on less site-specific, re-deployable modules, which makes this technology less dependent on institutional characteristics. It is speculated that the existence of telecommunications technology that is less sensitive to the parameters of the institutional environment and, in particular, to poor investment protection provides an opportunity for better understanding of the constraints on and prospects for economic development.

This problem consists in the analysis of the dynamics of an infinitesimal mass particle attracted by two primaries of identical masses describing elliptic relative equilibria of the two-body problem on Mk2, i.e., surfaces of constant curvature. The Hamiltonian formulation of this problem is pointed out in intrinsic coordinates. In this sense, we describe the number of equilibria and their linear stability depending on the bifurcation parameter corresponding to the radial parameter a. After that, we prove the existence of families of periodic orbits and KAM 2-tori related to these orbits.

We classify and analyze the orbits of the Kepler problem on surfaces of constant curvature (both positive and negative, S2 and H2, respectively) as functions of the angular momentum and the energy. Hill's regions are characterized, and the problem of the time of collision is studied. We also regularize the problem in Cartesian and intrinsic coordinates, depending on the constant angular momentum, and we describe the orbits of the regularized vector field. The phase portraits for both S2 and H2 are pointed out.
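For reference, the curved Kepler problem is commonly written with the cotangent potential; the following formulation is standard in this literature, not quoted from the abstract:

```latex
% On S^2 (curvature +1) and H^2 (curvature -1), for a particle at
% geodesic distance \theta from the attracting center,
U_{S^2}(\theta) = -k\,\cot\theta, \qquad
U_{H^2}(\theta) = -k\,\coth\theta,
% and orbit types are classified by the energy h and angular momentum c
% through the effective potential
U_{\mathrm{eff}}(\theta) = \frac{c^2}{2\sin^2\theta} - k\cot\theta .
```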

We consider a setup in which a principal must decide whether or not to legalize a socially undesirable activity. The law is enforced by a monitor who may be bribed to conceal evidence of the offense and who may also engage in extortionary practices. The principal may legalize the activity even if it is a very harmful one.

The principal may also declare the activity illegal knowing that the monitor will abuse the law to extract bribes out of innocent people. Our model offers a novel rationale for legalizing possession and consumption of drugs while continuing to prosecute drug dealers. We study a channel through which inflation can have effects on the real economy. Using job creation and destruction data from U. These results are robust to controls for the real-business cycle and monetary policy. Over a longer time frame, data on business failures confirm our results obtained from job creation and destruction data.

We discuss how the interaction of inflation with financial markets, nominal wage rigidities, and imperfect competition could explain the empirical evidence.

We study how discount window policy affects the frequency of banking crises, the level of investment, and the scope for indeterminacy of equilibrium. Previous work has shown that providing costless liquidity through a discount window has mixed effects in terms of these criteria: it prevents episodes of high liquidity demand from causing crises, but can lead to indeterminacy of stationary equilibrium and to inefficiently low levels of investment.

We show how offering discount window loans at an above-market interest rate can be unambiguously beneficial. Such a policy generates a unique stationary equilibrium. Banking crises occur with positive probability in this equilibrium and the level of investment is suboptimal, but a proper combination of discount window and monetary policies can make the welfare effects of these inefficiencies arbitrarily small. The near-optimal policies can be viewed as approximately implementing the Friedman rule.

We investigate the dependence of the dynamic behavior of an endogenous growth model on the degree of returns to scale. We focus on a simple but representative growth model with publicly funded inventive activity. We show that constant returns to reproducible factors (the leading case in the endogenous growth literature) is a bifurcation point, and that it has the characteristics of a transcritical bifurcation.

The bifurcation involves the boundary of the state space, making it difficult to formally verify this classification. For a special case, we provide a transformation that allows formal classification by existing methods. We discuss the new methods that would be needed for formal verification of transcriticality in a broader class of models.

We evaluate the desirability of having an elastic currency generated by a lender of last resort that prints money and lends it to banks in distress. When banks cannot borrow, the economy has a unique equilibrium that is not Pareto optimal.

The introduction of unlimited borrowing at a zero nominal interest rate generates a steady state equilibrium that is Pareto optimal. However, this policy is destabilizing in the sense that it also introduces a continuum of nonoptimal inflationary equilibria. We explore two alternate policies aimed at eliminating such monetary instability while preserving the steady-state benefits of an elastic currency.

If the lender of last resort imposes an upper bound on borrowing that is low enough, no inflationary equilibria can arise. For some but not all economies, the unique equilibrium under this policy is Pareto optimal. If the lender of last resort instead charges a zero real interest rate, no inflationary equilibria can arise.

The unique equilibrium in this case is always Pareto optimal.

We consider the nature of the relationship between the real exchange rate and capital formation. We present a model of a small open economy that produces and consumes two goods, one tradable and one not. Domestic residents can borrow and lend abroad, and costly state verification (CSV) is a source of frictions in domestic credit markets. The real exchange rate matters for capital accumulation because it affects the potential for investors to provide internal finance, which mitigates the CSV problem.

We demonstrate that the real exchange rate must monotonically approach its steady state level. However, capital accumulation need not be monotonic, and real exchange rate appreciation can be associated with either a rising or a falling capital stock. The relationship between world financial market conditions and the real exchange rate is also investigated.

With billions of subscribers worldwide, GSM remains the most widely deployed cellular standard. However, the security criteria envisioned 30 years ago, when the standard was designed, are no longer sufficient to ensure the security and privacy of the users.

Furthermore, even with the newest fourth generation (4G) cellular technologies starting to be deployed, these networks could never achieve strong security guarantees because mobile network operators (MNOs) keep backwards compatibility given the huge number of GSM subscribers. In this paper, we present and describe the tools and the necessary steps to perform an active attack against a GSM-compatible network, exploiting the GSM protocol's lack of mutual authentication between the subscribers and the network.

The attack consists of a so-called man-in-the-middle attack implementation. Using software-defined radio (SDR), open-source libraries, and open-source hardware, we set up a fake GSM base station to impersonate the network, and can therefore eavesdrop on any communications that are being routed through it and extract information from the victims. Finally, we point out some implications of the protocol vulnerabilities and why they cannot be mitigated in the short term, since 4G deployments will take a long time to entirely replace the current GSM infrastructure.

It is shown in the paper that the problem of speed observation for mechanical systems that are partially linearisable via coordinate changes admits a very simple and robust exponentially stable solution with a Luenberger-like observer. This result should be contrasted with the very complicated observers based on immersion and invariance reported in the literature.

A second contribution of the paper is to compare, via realistic simulations and highly detailed experiments, the performance of the proposed observer with well-known high-gain and sliding mode observers. In particular, it is shown that, due to their high sensitivity to noise, which is unavoidable in mechanical systems applications, the performance of the two latter designs is well below par.
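A Luenberger-style velocity observer of the kind discussed is easy to simulate; the sketch below (hypothetical second-order mechanical system and gains, not the paper's design) estimates velocity from noisy position measurements.

```python
# A minimal sketch (not the paper's observer) of a Luenberger observer
# estimating the unmeasured velocity of a mass from position alone:
#   x' = A x,  y = C x,  xhat' = A xhat + L (y - C xhat).
import numpy as np

dt, steps = 1e-3, 5000
A = np.array([[0.0, 1.0],            # state: [position, velocity]
              [-4.0, -0.5]])         # spring-damper dynamics (hypothetical)
C = np.array([[1.0, 0.0]])           # only position is measured
L = np.array([[20.0], [80.0]])       # gain chosen so A - L C is stable

x = np.array([1.0, 0.0])             # true initial state
xhat = np.zeros(2)                   # observer starts at rest
rng = np.random.default_rng(4)

for _ in range(steps):               # forward-Euler integration
    y = C @ x + rng.normal(0, 1e-3)  # noisy position measurement
    x = x + dt * (A @ x)
    xhat = xhat + dt * (A @ xhat + (L @ (y - C @ xhat)).ravel())

print("true velocity:", x[1], " estimated:", xhat[1])
```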

We formulate a p-median facility location model with a queuing approximation to determine the optimal locations of a given number of dispensing sites (points of dispensing, PODs) from a predetermined set of possible locations, and the optimal allocation of staff to the selected locations. Specific to an anthrax attack, dispensing operations should be completed within 48 hours to cover all exposed and possibly exposed people. A nonlinear integer programming model is developed; it formulates the problem of determining the optimal locations of facilities with appropriate facility deployment strategies, including the number of servers with different skills to be allocated to each open facility.

The objective of the mathematical model is to minimize the average transportation and waiting times of individuals to receive the required service. The mathematical model approximates the waiting time performance measures with a queuing formula, and these waiting times at PODs are incorporated into the p-median facility location model.
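The interplay between location and congestion can be sketched with a toy version of the model: enumerate candidate subsets of size p, approximate waiting times with an M/M/c (Erlang-C) formula, and minimize average travel plus waiting time. All data below are hypothetical.

```python
# A minimal sketch, assuming an M/M/c (Erlang-C) waiting-time
# approximation at each POD: enumerate facility subsets of size p,
# split staff evenly, and minimize average travel + waiting time.
from itertools import combinations
from math import factorial

def erlang_c_wait(lam, mu, c):
    # Expected wait in an M/M/c queue; infinite if unstable.
    rho = lam / (c * mu)
    if rho >= 1.0:
        return float("inf")
    a = lam / mu
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    pw = a**c / (factorial(c) * (1 - rho)) * p0   # prob. of waiting
    return pw / (c * mu - lam)

# Hypothetical data: 4 candidate sites, 3 demand zones, travel times in hours.
travel = [[0.2, 0.6, 0.9, 0.4],
          [0.7, 0.1, 0.5, 0.8],
          [0.9, 0.8, 0.2, 0.3]]
demand = [120.0, 80.0, 60.0]       # arrivals per hour per zone
mu, staff, p = 12.0, 24, 2         # service rate per server, total staff, sites

best = None
for sites in combinations(range(4), p):
    c = staff // p                  # even staff split (simplification)
    assign = [min(sites, key=lambda j: travel[i][j]) for i in range(3)]
    lam = {j: sum(d for i, d in enumerate(demand) if assign[i] == j)
           for j in sites}
    tot = sum(demand[i] * (travel[i][assign[i]]
              + erlang_c_wait(lam[assign[i]], mu, c)) for i in range(3))
    obj = tot / sum(demand)
    if best is None or obj < best[0]:
        best = (obj, sites)

print("best sites:", best[1], "avg travel+wait (h):", round(best[0], 3))
```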
