=LDR  06519nam 2200829Ii 4500
=001  2200000089
=003  NOW
=005  20220323190106.0
=006  m eo d
=007  cr cn |||m|||a
=008  190401s2021 maua ob 000 0 eng d
=020  \\$a9781680839135$qelectronic
=020  \\$z9781680839128$qprint
=024  7\$a10.1561/2200000089$2doi
=040  \\$aCaBNVSL$cCaBNVSL$dCaBNVSL
=050  \4$aQA76.87$b.G57 2021eb
=082  04$a006.3/2$223
=100  1\$aGirin, Laurent,$eauthor.
=245  10$aDynamical variational autoencoders :$ba comprehensive review /$cLaurent Girin, Xiaoyu Bie, Thomas Hueber, Simon Leglaive, Julien Diard, Xavier Alameda-Pineda.
=264  \1$a[Hanover, Massachusetts] :$bNow Publishers,$c2021.
=300  \\$a1 PDF (pages 1-175) :$billustrations (some color)
=336  \\$atext$2rdacontent
=337  \\$aelectronic$2isbdmedia
=338  \\$aonline resource$2rdacarrier
=490  1\$aFoundations and trends in machine learning,$x1935-8245 ;$vvol. 15: no. 1-2, pp 1-175
=504  \\$aIncludes bibliographical references (pages 159-175).
=505  0\$a1. Introduction. 1.1. Deep dynamical Bayesian networks ; 1.2. Variational inference and VAEs ; 1.3. Dynamical VAEs ; 1.4. Aim, contributions, and outline of the monograph --
=505  8\$a2. Variational autoencoders. 2.1. Principle ; 2.2. VAE generative model ; 2.3. Learning with variational inference ; 2.4. VAE inference model ; 2.5. VAE training --
=505  8\$a3. Recurrent neural networks and state space models. 3.1. Recurrent neural networks ; 3.2. State space models --
=505  8\$a4. Definition of dynamical VAEs. 4.1. Generative model ; 4.2. Inference model ; 4.3. VLB and training of DVAEs ; 4.4. Additional dichotomy for autoregressive DVAE models ; 4.5. DVAE summary --
=505  8\$a5. Deep Kalman filters. 5.1. Generative model ; 5.2. Inference model ; 5.3. Training --
=505  8\$a6. Kalman variational autoencoders. 6.1. Generative model ; 6.2. Inference model ; 6.3. Training --
=505  8\$a7. STOchastic recurrent networks. 7.1. Generative model ; 7.2. Inference model ; 7.3. Training --
=505  8\$a8. Variational recurrent neural networks. 8.1. Generative model ; 8.2. Inference model ; 8.3. Training ; 8.4. Improved VRNN and VRNN applications --
=505  8\$a9. Stochastic recurrent neural networks. 9.1. Generative model ; 9.2. Inference model ; 9.3. Training --
=505  8\$a10. Recurrent variational autoencoders. 10.1. Generative model ; 10.2. Inference model ; 10.3. Training --
=505  8\$a11. Disentangled sequential autoencoders. 11.1. Generative model ; 11.2. Inference model ; 11.3. Training --
=505  8\$a12. Brief tour of other models. 12.1. Models related to DKF ; 12.2. Models related to STORN, VRNN, and SRNN ; 12.3. Other models --
=505  8\$a13. Experiments. 13.1. DVAE architectures ; 13.2. Experimental protocol ; 13.3. Results on speech data ; 13.4. Results on 3D human motion data ; 13.5. Conclusion --
=505  8\$a14. Discussion. 14.1. Fundamental motivation for DVAEs ; 14.2. DVAE outcome: a story of flexibility ; 14.3. VAE improvements and extensions applicable to DVAEs ; 14.4. Perspectives on source coding -- Acknowledgements --
=505  8\$aAppendices. A. Marginalization of h_t in STORN --
=505  8\$aB. Implementation of the DVAE models for the experiments with speech data. B.1. DKF ; B.2. STORN ; B.3. VRNN ; B.4. SRNN ; B.5. RVAE ; B.6. DSAE --
=505  8\$aC. Implementation of the DVAE models for the experiments with 3D human motion data. C.1. DKF ; C.2. STORN ; C.3. VRNN ; C.4. SRNN ; C.5. RVAE ; C.6. DSAE -- References.
=506  \\$aRestricted to subscribers or individual document purchasers.
=510  0\$aGoogle Scholar
=510  0\$aGoogle Book Search
=510  0\$aINSPEC
=510  0\$aScopus
=510  0\$aACM Computing Guide
=510  0\$aDBLP Computer Science Bibliography
=510  0\$aZentralblatt MATH Database
=510  0\$aAMS MathSciNet
=510  0\$aACM Computing Reviews
=520  3\$aVariational autoencoders (VAEs) are powerful deep generative models widely used to represent high-dimensional complex data through a low-dimensional latent space learned in an unsupervised manner. In the original VAE model, the input data vectors are processed independently. Recently, a series of papers have presented different extensions of the VAE to process sequential data, which model not only the latent space but also the temporal dependencies within a sequence of data vectors and corresponding latent vectors, relying on recurrent neural networks or state-space models. In this monograph, we perform a literature review of these models. We introduce and discuss a general class of models, called dynamical variational autoencoders (DVAEs), which encompasses a large subset of these temporal VAE extensions. Then, we present in detail seven recently proposed DVAE models, with an aim to homogenize the notations and presentation lines, as well as to relate these models with existing classical temporal models. We have reimplemented those seven DVAE models and present the results of an experimental benchmark conducted on the speech analysis-resynthesis task (the PyTorch code is made publicly available). The monograph concludes with a discussion on important issues concerning the DVAE class of models and future research guidelines.
=524  \\$aLaurent Girin, Simon Leglaive, Xiaoyu Bie, Julien Diard, Thomas Hueber and Xavier Alameda-Pineda (2021), "Dynamical Variational Autoencoders: A Comprehensive Review", Foundations and Trends in Machine Learning: Vol. 15: No. 1-2, pp 1-175.
=530  \\$aAlso available in print.
=538  \\$aMode of access: World Wide Web.
=538  \\$aSystem requirements: Adobe Acrobat reader.
=588  \\$aTitle from PDF (viewed on Mar. 18, 2022).
=650  \0$aNeural networks (Computer science)
=650  \0$aGenerative programming (Computer science)
=650  \0$aArtificial intelligence.
=655  \0$aElectronic books.
=700  1\$aBie, Xiaoyu,$eauthor.
=700  1\$aHueber, Thomas,$eauthor.
=700  1\$aLeglaive, Simon,$eauthor.
=700  1\$aDiard, Julien,$eauthor.
=700  1\$aAlameda-Pineda, Xavier,$d1985-$eauthor.
=710  2\$aNow Publishers,$epublisher.
=776  08$iPrint version:$z978-1680839128
=830  \0$aFoundations and trends in machine learning ;$vvol. 15: no. 1-2, pp 1-175.$x1935-8245
=856  48$3Abstract with links to full text$uhttp://dx.doi.org/10.1561/2200000089

=LDR  06163nam 2200709Ii 4500
=001  2200000081
=003  NOW
=005  20220323190106.0
=006  m eo d
=007  cr cn |||m|||a
=008  190401s2021 mau ob 000 0 eng d
=020  \\$a9781680838992$qelectronic
=020  \\$z9781680838985$qprint
=024  7\$a10.1561/2200000081$2doi
=040  \\$aCaBNVSL$cCaBNVSL$dCaBNVSL
=050  \4$aQA76.9.A96$bH65 2021eb
=082  04$a511.3$223
=100  1\$aHolden, Sean B.,$eauthor.
=245  10$aMachine learning for automated theorem proving :$blearning to solve SAT and QSAT /$cSean B. Holden.
=264  \1$a[Hanover, Massachusetts] :$bNow Publishers,$c2021.
=300  \\$a1 PDF (pages 807-989)
=336  \\$atext$2rdacontent
=337  \\$aelectronic$2isbdmedia
=338  \\$aonline resource$2rdacarrier
=490  1\$aFoundations and trends in machine learning,$x1935-8245 ;$vvol. 14: no. 6, pp 807-989
=504  \\$aIncludes bibliographical references (pages 962-989).
=505  0\$a1. Introduction. 1.1. Coverage ; 1.2. Outline of the review ; 1.3. Limits to coverage ; 1.4. What should the reader gain? --
=505  8\$a2. Algorithms for solving SAT. 2.1. The SAT problem ; 2.2. The DPLL algorithm ; 2.3. Local search SAT solvers ; 2.4. Conflict-driven clause learning ; 2.5. Portfolio solvers ; 2.6. Standard input file formats --
=505  8\$a3. Machine learning. 3.1. Supervised learning ; 3.2. Unsupervised learning ; 3.3. Multi-armed bandits ; 3.4. Reinforcement learning ; 3.5. Neural networks ; 3.6. Genetic algorithms and genetic programming ; 3.7. Choosing a learning algorithm ; 3.8. Sources of data --
=505  8\$a4. Extracting features from a formula. 4.1. Feature-engineered representations ; 4.2. Graph representations ; 4.3. Discussion --
=505  8\$a5. Learning to identify satisfiability directly. 5.1. Early approaches to SAT as classification ; 5.2. GAs for solving SAT directly ; 5.3. SAT as classification using GNNs and NNs ; 5.4. Learning to recognize sequents ; 5.5. Differentiable solvers ; 5.6. Discussion --
=505  8\$a6. Learning for portfolio SAT solvers. 6.1. Empirical hardness models ; 6.2. Portfolios: learning to select a SAT solver ; 6.3. Learning portfolios using latent classes ; 6.4. Simplified approaches to portfolio SAT solvers ; 6.5. NNs for portfolio solvers ; 6.6. Discussion --
=505  8\$a7. Learning for CDCL solvers. 7.1. Learning to select a preprocessor ; 7.2. Learning to select a heuristic ; 7.3. Learning to select decision variables ; 7.4. Learning to select a restart strategy ; 7.5. Learning to delete learned clauses ; 7.6. GAs for learning CDCL heuristics ; 7.7. Learning to select solver parameters ; 7.8. Specializing a SAT solver at the source code level ; 7.9. Discussion --
=505  8\$a8. Learning to improve local-search SAT solvers. 8.1. Standard variable selection heuristics for local search ; 8.2. Evolutionary learning of local search heuristics ; 8.3. Learning good parameters for local search solvers --
=505  8\$a9. Learning to solve quantified Boolean formulas. 9.1. Learning for portfolios of QSAT solvers ; 9.2. Learning in non-portfolio QSAT solvers ; 9.3. Discussion --
=505  8\$a10. Learning for intuitionistic propositional logic. 10.1. Methods employing the Curry-Howard correspondence ; 10.2. Methods employing sequent calculus ; 10.3. Discussion --
=505  8\$a11. Conclusion. 11.1. The structure of solvers ; 11.2. What is the appropriate level of complexity? ; 11.3. What about parallel solvers? ; 11.4. Solver competitions -- Acknowledgements --
=505  8\$aAppendices. A. Abbreviations --
=505  8\$aB. Symbols -- References.
=506  \\$aRestricted to subscribers or individual document purchasers.
=510  0\$aGoogle Scholar
=510  0\$aGoogle Book Search
=510  0\$aINSPEC
=510  0\$aScopus
=510  0\$aACM Computing Guide
=510  0\$aDBLP Computer Science Bibliography
=510  0\$aZentralblatt MATH Database
=510  0\$aAMS MathSciNet
=510  0\$aACM Computing Reviews
=520  3\$aThe decision problem for Boolean satisfiability, generally referred to as SAT, is the archetypal NP-complete problem, and encodings of many problems of practical interest exist allowing them to be treated as SAT problems. Its generalization to quantified SAT (QSAT) is PSPACE-complete, and is useful for the same reason. Despite the computational complexity of SAT and QSAT, methods have been developed allowing large instances to be solved within reasonable resource constraints. These techniques have largely exploited algorithmic developments; however, machine learning also exerts a significant influence in the development of state-of-the-art solvers. Here, the application of machine learning is delicate, as in many cases, even if a relevant learning problem can be solved, it may be that incorporating the result into a SAT or QSAT solver is counterproductive, because the run-time of such solvers can be sensitive to small implementation changes. The application of better machine learning methods in this area is thus an ongoing challenge, with characteristics unique to the field. This work provides a comprehensive review of the research to date on incorporating machine learning into SAT and QSAT solvers, as a resource for those interested in further advancing the field.
=524  \\$aSean B. Holden (2021), "Machine Learning for Automated Theorem Proving: Learning to Solve SAT and QSAT", Foundations and Trends in Machine Learning: Vol. 14: No. 6, pp 807-989.
=530  \\$aAlso available in print.
=538  \\$aMode of access: World Wide Web.
=538  \\$aSystem requirements: Adobe Acrobat reader.
=588  \\$aTitle from PDF (viewed on Mar. 18, 2022).
=650  \0$aMachine learning.
=650  \0$aAutomatic theorem proving.
=655  \0$aElectronic books.
=710  2\$aNow Publishers,$epublisher.
=776  08$iPrint version:$z9781680838985
=830  \0$aFoundations and trends in machine learning ;$vvol. 14: no. 6, pp 807-989.$x1935-8245
=856  48$3Abstract with links to full text$uhttp://dx.doi.org/10.1561/2200000081

=LDR  04675nam 2200625Ii 4500
=001  1400000066
=003  NOW
=005  20220323190106.0
=006  m eo d
=007  cr cn |||m|||a
=008  190401s2021 mau ob 000 0 eng d
=020  \\$a9781680839012$qelectronic
=020  \\$z9781680839005$qprint
=024  7\$a10.1561/1400000066$2doi
=040  \\$aCaBNVSL$cCaBNVSL$dCaBNVSL
=050  \4$aK1335$b.B54 2021eb
=082  04$a346.06648$223
=100  1\$aBleibtreu, Christopher,$eauthor.
=245  10$aAudit regulations, audit market structure, and financial reporting quality /$cChristopher Bleibtreu, Ulrike Stefani.
=264  \1$a[Hanover, Massachusetts] :$bNow Publishers,$c2021.
=300  \\$a1 PDF (pages 1-183)
=336  \\$atext$2rdacontent
=337  \\$aelectronic$2isbdmedia
=338  \\$aonline resource$2rdacarrier
=490  1\$aFoundations and trends in accounting,$x1554-0650 ;$vvol. 16: no. 1-2, pp 1-183
=504  \\$aIncludes bibliographical references (pages 160-183).
=505  0\$a1. Introduction --
=505  8\$a2. The structure of the audit market. 2.1. Introduction ; 2.2. Measures of concentration and competition ; 2.3. Empirical studies on the level of audit market concentration at the national level ; 2.4. Reasons why today's level of audit market concentration is high ; 2.5. Summary --
=505  8\$a3. Potential effects of audit market concentration. 3.1. Introduction ; 3.2. Regulators' concerns about audit market concentration ; 3.3. Empirical studies on the effects of audit market concentration ; 3.4. Summary --
=505  8\$a4. Audit regulations. 4.1. Introduction ; 4.2. Selected audit market regulations ; 4.3. Summary --
=505  8\$a5. Empirical results on the effects of regulations on audit quality and market structure. 5.1. Introduction ; 5.2. The effect of audit regulations on audit quality ; 5.3. The effect of audit regulations on market structure ; 5.4. Summary --
=505  8\$a6. Analytical papers that consider the structure of the audit market as exogenous. 6.1. Introduction ; 6.2. The effect of audit regulations on audit quality ; 6.3. Summary --
=505  8\$a7. Analytical papers that consider the structure of the audit market as endogenous. 7.1. Introduction ; 7.2. Spatial competition models ; 7.3. Spatial competition models in auditing ; 7.4. Spatial competition models investigating the effects of audit regulations ; 7.5. Summary --
=505  8\$a8. Discussion -- Symbols -- Abbreviations -- References.
=506  \\$aRestricted to subscribers or individual document purchasers.
=510  0\$aGoogle Scholar
=510  0\$aGoogle Book Search
=510  0\$aINSPEC
=510  0\$aScopus
=510  0\$aZentralblatt MATH Database
=510  0\$aAMS MathSciNet
=520  3\$aIn order to reduce the high level of concentration in the market segment of statutory audits of listed companies and to improve audit quality, new audit market regulations have been introduced (e.g., the mandatory rotation of the audit firm in the EU and the prohibition of single-provider auditing and consulting in the EU and in the U.S.). Other measures are currently discussed (e.g., joint audits or shared audits in the UK). However, the empirical evidence as to whether such regulations have the expected effects and whether there is actually a negative correlation between concentration and audit quality is mixed. This could be because the effects of regulatory measures on auditor and auditee incentives and their effects on market structure are interdependent, and, moreover, simultaneously determine audit quality. We therefore not only provide a structured overview of the empirical literature on the effects of audit market regulations, but also discuss how to analyze these effects based on analytical models.
=524  \\$aChristopher Bleibtreu and Ulrike Stefani (2021), "Audit Regulations, Audit Market Structure, and Financial Reporting Quality", Foundations and Trends in Accounting: Vol. 16: No. 1-2, pp 1-183.
=530  \\$aAlso available in print.
=538  \\$aMode of access: World Wide Web.
=538  \\$aSystem requirements: Adobe Acrobat reader.
=588  \\$aTitle from PDF (viewed on Mar. 18, 2022).
=650  \0$aAuditing$xLaw and legislation.
=650  \0$aAudited financial statements.
=655  \0$aElectronic books.
=700  1\$aStefani, Ulrike,$eauthor.
=710  2\$aNow Publishers,$epublisher.
=776  08$iPrint version:$z9781680839005
=830  \0$aFoundations and trends in accounting ;$vvol. 16: no. 1-2, pp 1-183.$x1554-0650
=856  48$3Abstract with links to full text$uhttp://dx.doi.org/10.1561/1400000066

=LDR  04552nam 2200781Ii 4500
=001  109.00000033
=003  NOW
=005  20220323190106.0
=006  m eo d
=007  cr cn |||m|||a
=008  190401s2021 mau ob 000 0 eng d
=020  \\$a9781680838954$qelectronic
=020  \\$z9781680838947$qprint
=024  7\$a10.1561/109.00000033$2doi
=040  \\$aCaBNVSL$cCaBNVSL$dCaBNVSL
=050  \4$aHD60$b.B69 2021eb
=082  04$a658.408$223
=100  1\$aBoyer, Marcel,$eauthor.
=245  10$aBeyond ESG :$breforming capitalism and social democracy /$cMarcel Boyer.
=246  3\$aBeyond environmental, social, and governance
=264  \1$a[Hanover, Massachusetts] :$bNow Publishers,$c2021.
=300  \\$a1 PDF (pages 90-226)
=336  \\$atext$2rdacontent
=337  \\$aelectronic$2isbdmedia
=338  \\$aonline resource$2rdacarrier
=490  1\$aAnnals of corporate governance,$x2381-6732 ;$vvol. 6: no. 2-3, pp 90-226
=504  \\$aIncludes bibliographical references (pages 216-226).
=505  0\$a1. Introduction. 1.1. The ESG movement ; 1.2. The notion of value ; 1.3. The role of firms, entrepreneurs, and competition ; 1.4. Collective intelligence and trust --
=505  8\$a2. Ethics and equity. 2.1. Environment (externalities) ; 2.2. Protecting and sharing water ; 2.3. The value of life ; 2.4. Ethical behavior (collaboration, concertation, competition) ; 2.5. Fair and equitable compensation: the music industry ; 2.6. Inequalities (income, wealth, consumption): their measure and their role ; 2.7. Socially responsible behaviors and governance (ESG) --
=505  8\$a3. Analyzing and reforming capitalism. 3.1. Reforming capitalism (Zingales, Mason, Tirole, The Economist, Stiglitz, Aghion, Piketty) ; 3.2. The WBCSD reform plan (2020) ; 3.3. The key role of competition --
=505  8\$a4. The new competition-based capitalism (NCC): a ten-point reform --
=505  8\$a5. The competition-based social democracy (CSD): a ten-point plan. 5.1. The new roles of the governmental and competitive sectors ; 5.2. The CSD ten generic policies and programs --
=505  8\$a6. Conclusion: NCC and CSD, drastic but not utopian reforms -- Acknowledgments -- References.
=506  \\$aRestricted to subscribers or individual document purchasers.
=510  0\$aGoogle Scholar
=510  0\$aGoogle Book Search
=510  0\$aINSPEC
=510  0\$aScopus
=510  0\$aACM Computing Guide
=510  0\$aDBLP Computer Science Bibliography
=510  0\$aZentralblatt MATH Database
=510  0\$aAMS MathSciNet
=510  0\$aACM Computing Reviews
=520  3\$aSeveral voices are rising to demand an in-depth reform of capitalism in the wake of the increase in income and wealth inequalities of the last four decades, the climate urgency in a local global world, and the financial crisis of 2007-2010. At the forefront of this movement are different groups aiming at redefining the role of businesses around socially responsible stakeholders' governance. There is a real danger that governments will be put under pressure from misinformed constituencies and will want to play Goethe's sorcerer's apprentice: too often, good intentions are but a paved road to hell. In this document, I analyze various reform projects, I discuss the concepts of ethics and equity (environment, water, life, remuneration, inequalities, ESG) and I propose projects for in-depth reforms of capitalism and social democracy.
=524  \\$aMarcel Boyer (2021), "Beyond ESG: Reforming Capitalism and Social Democracy", Annals of Corporate Governance: Vol. 6: No. 2-3, pp 90-226.
=530  \\$aAlso available in print.
=538  \\$aMode of access: World Wide Web.
=538  \\$aSystem requirements: Adobe Acrobat reader.
=588  \\$aTitle from PDF (viewed on Mar. 18, 2022).
=650  \0$aSocial responsibility of business.
=650  \0$aCorporate culture$xEconomic aspects.
=653  \\$aCollective intelligence.
=653  \\$aCompetition.
=653  \\$aCapitalism.
=653  \\$aSocial democracy.
=653  \\$aValue.
=653  \\$aESG.
=653  \\$aEthics.
=653  \\$aEquity.
=653  \\$aEnvironment.
=653  \\$aWater.
=653  \\$aFair and equitable remuneration.
=653  \\$aTrust.
=655  \0$aElectronic books.
=710  2\$aNow Publishers,$epublisher.
=776  08$iPrint version:$z9781680838947
=830  \0$aAnnals of corporate governance ;$vvol. 6: no. 2-3, pp 90-226
=856  48$3Abstract with links to full text$uhttp://dx.doi.org/10.1561/109.00000033