{"id":106,"date":"2014-05-02T10:06:56","date_gmt":"2014-05-02T10:06:56","guid":{"rendered":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/?p=106"},"modified":"2025-02-26T13:21:39","modified_gmt":"2025-02-26T13:21:39","slug":"creating-a-mata-library","status":"publish","type":"post","link":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/2014\/05\/02\/creating-a-mata-library\/","title":{"rendered":"Creating a Mata Library"},"content":{"rendered":"<p>MCMC algorithms can be slow, so it is often necessary to pay particular attention to the efficiency of\u00a0one&#8217;s <a href=\"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/files\/2014\/04\/library.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-107 alignright\" src=\"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/files\/2014\/04\/library.png\" alt=\"library\" width=\"171\" height=\"192\" \/><\/a>code and usually this means programming in Mata.\u00a0For this reason,\u00a0the slice, griddy, ARS and ARMS samplers that are described in <i>\u2018Bayesian Analysis with Stata\u2019 <\/i>were programmed using Mata even though they can be called from either Mata or Stata. For convenience, those four samplers are collected together with other Mata code referred to in the book, in a library that I called <b>libmcmc<\/b> (in Stata a library name needs to begin with lib).<\/p>\n<p>In\u00a0future postings I\u00a0will introduce Mata code for fitting\u00a0some\u00a0further\u00a0Bayesian models, so I thought that I would create a new library that I will call <b>libbayes.<\/b>\u00a0This library\u00a0will contain all of the Mata functions that I refer to in this blog. 
I will start today by showing how the library is created and then I will add some functions for simulating random values from a few standard distributions.<\/p>\n<p>In order to create the library, we need a do file that follows the pattern:<\/p>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>set matastrict off<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata:<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata clear<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib create libbayes , dir(PERSONAL) replace<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>void prog1(\u2026)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>{ \u2026.<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>}<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib add libbayes prog1()<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>void prog2(\u2026)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>{ \u2026.<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>}<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib add libbayes prog2()<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u2026.<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib index<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>end<\/b><\/span><\/pre>\n<p>When this do file is run, the library is created in the PERSONAL folder, which I have chosen because that directory is searched automatically by Stata when it looks for Mata 
libraries (type the command <b>-sysdir-<\/b> to find the location of this folder). Each program is added in turn and\u00a0then Stata is told to update its index of Mata libraries. Updating the index is only needed the first time that a new library is created\u00a0because the index is automatically updated at the start of each Stata session, but it does no harm if the index is updated unnecessarily.\u00a0I tend to be rather lazy with my programming, so I have switched matastrict off. Had I switched it on, then I would have had to define the type of every structure that I use in the functions.<\/p>\n<p>As my first project I\u00a0plan to\u00a0write a function for fitting a finite mixture of multivariate normal distributions using conjugate priors and a Gibbs sampler. This code can be used\u00a0in many\u00a0different ways, for example for density smoothing, clustering or even fitting smooth lines to a scatter plot. It also serves as a convenient stepping stone to introducing non-parametric Bayesian analysis, which can be thought of as an extension\u00a0of finite mixture modelling in which the number of components in the mixture becomes infinite.<\/p>\n<p>To create the Gibbs sampler for fitting a mixture of multivariate normal\u00a0distributions, we will need functions that simulate random values from the Wishart, multivariate normal, Dirichlet and categorical distributions, so today I will add those functions to my library.<\/p>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>set matastrict off<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata:<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata clear<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib create libbayes , dir(PERSONAL) replace<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: 
#0000ff\"><b>\/*-------------------------------------------- <\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* Single categorical observation <\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* P = probabilities of each cell (sum to 1) <\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* returns selected cell number <\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>*--------------------------------------------*\/<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>real scalar rCategorical(real vector P)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>{<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 real scalar k,u,h,sumP<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 k=length(P)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 u=runiform(1,1)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 h=1<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 sumP=P[1]<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 while( u &gt; sumP &amp; h &lt; k ) {<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0\u00a0\u00a0 h++<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0\u00a0\u00a0 sumP=sumP+P[h]<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 }<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 
return(h)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>}\u00a0\u00a0 <\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib add libbayes rCategorical()<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\/*--------------------------------------------<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* Single Dirichlet distribution - parameter alpha<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>*--------------------------------------------*\/<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>real colvector rDirichlet(real vector alpha)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>{<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 real scalar h,sP<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 real colvector P<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 h=length(alpha)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 P=J(h,1,0)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 sP=0<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 for(i=1;i&lt;=h;i++) {<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0\u00a0\u00a0 P[i]=rgamma(1,1,alpha[i],1)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0\u00a0\u00a0 sP=sP+P[i]<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 }<\/b><\/span><\/pre>\n<pre 
style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 return(P\/sP)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>}<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib add libbayes rDirichlet()<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\/*--------------------------------------------<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* Single random multivariate normal variable<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* M = mean<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* L = cholesky(V), V=variance matrix<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>*--------------------------------------------*\/<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>real rowvector rMNormal(real rowvector M,real matrix L)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>{<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0 return(M+(L*rnormal(rows(L),1,0,1))')<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>}<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib add libbayes rMNormal()<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\/*--------------------------------------------<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* Single Wishart matrix<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* L = cholesky(invsym(R)), where R = k*variance matrix<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>* k = degrees of freedom<\/b><\/span><\/pre>\n<pre style=\"padding-left: 
30px\"><span style=\"color: #0000ff\"><b>*--------------------------------------------*\/<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>real matrix rWishart(real matrix L,real scalar k )<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>{\u00a0\u00a0 <\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0 p = rows(L)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0 B = J(p,p,0)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0 for(i=2;i&lt;=p;i++) {<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 for(j=1;j&lt;i;j++) {<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0 \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0B[i,j] = rnormal(1,1,0,1)<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 }<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0 }<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0 for(i=1;i&lt;=p;i++) B[i,i] = sqrt(rchi2(1,1,k-i+1))<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>\u00a0\u00a0\u00a0 return(L*B*B'*L')<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>}<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib add libbayes rWishart()<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>mata mlib index<\/b><\/span><\/pre>\n<pre style=\"padding-left: 30px\"><span style=\"color: #0000ff\"><b>end<\/b><\/span><\/pre>\n<p>The 
multivariate normal simulator takes the Cholesky decomposition of the variance matrix, V,\u00a0as its input, so it could be called as:<\/p>\n<pre><span style=\"color: #0000ff\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Y = rMNormal(Mu,cholesky(V))<\/span><\/pre>\n<p>The advantage of calculating the Cholesky decomposition outside the function is that if you want to simulate many values from distributions with the same variance matrix, then you can calculate the Cholesky decomposition once and then call rMNormal() as many times as you wish.<\/p>\n<p>A similar reasoning leads us to use cholesky(invsym(R)) as the input to the Wishart function. What is more, anyone who likes to parameterize in terms of S=invsym(R) can use the same function by calling it with the argument cholesky(S).<\/p>\n<p>The Dirichlet distribution is a generalization of the Beta distribution and provides a simple prior for a set of probabilities that sum to one. Drawing random probabilities from a Dirichlet distribution is just a matter of creating a set of appropriately chosen gamma variables and then normalizing them to sum to one. This algorithm, together with lots of other information on the Dirichlet distribution, can be found on the Wikipedia page (<a href=\"http:\/\/en.wikipedia.org\/wiki\/Dirichlet_distribution\">http:\/\/en.wikipedia.org\/wiki\/Dirichlet_distribution<\/a>).<\/p>\n<p><em><strong>Testing the functions<\/strong><\/em><\/p>\n<p>The functions in our library will be called by numerous other programs, so it is vitally important that we test them to ensure, first,\u00a0that they are correct and second, that they are robust to misuse. Robustness usually involves checking the inputs to ensure that dimensions match and constraints are fulfilled; for instance, the degrees-of-freedom parameter of a Wishart distribution should not be less than the dimension of the matrix. 
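To make the idea of such checks concrete, here is a minimal sketch of the kind of validation a calling routine could perform. It is written in Python rather than Mata, and the helper names check_categorical and check_wishart are my own invention, not part of libbayes:

```python
import math

def check_categorical(P, tol=1e-8):
    """Hypothetical guard: cell probabilities must be non-negative and sum to 1."""
    if any(p < 0 for p in P):
        raise ValueError("negative cell probability")
    if not math.isclose(sum(P), 1.0, abs_tol=tol):
        raise ValueError("cell probabilities do not sum to 1")

def check_wishart(dim, k):
    """Hypothetical guard: degrees of freedom must be at least the matrix dimension."""
    if k < dim:
        raise ValueError("Wishart degrees of freedom must be >= matrix dimension")
```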
I have deliberately chosen not to include any such checks in these functions because in a Bayesian analysis they might be called\u00a0hundreds of\u00a0thousands of times and speed is important. I leave it to the calling routines to ensure that the parameters are appropriate.<\/p>\n<p>To check the accuracy of the functions we could write a series of Mata programs that call\u00a0them but I prefer the interactivity\u00a0of Stata and so I wrote four short Stata programs that call these Mata functions and I used those for testing. Here is the code that I used to test the rCategorical() function.<\/p>\n<pre style=\"padding-left: 30px\"><strong><span style=\"color: #0000ff\">program rCat<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0  syntax newvarlist(max=1) , N(integer) P(namelist)<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0  if `n' &gt; _N qui set obs `n'<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0  qui gen `varlist' = .<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0  mata: P = st_matrix(\"`p'\")<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0  forvalues i=1\/`n' {<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0\u00a0\u00a0\u00a0   mata: H = rCategorical(P)<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0\u00a0     mata: st_numscalar(\"r(h)\",H)<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0\u00a0     qui replace `varlist' = r(h) in `i'<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">\u00a0  }<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">end\u00a0<\/span><\/strong><\/pre>\n<pre style=\"padding-left: 30px\"><strong><span style=\"color: #0000ff\">matrix PR = (0.2,0.4,0.1,0.3)<\/span><\/strong>\r\n<strong><span style=\"color: #0000ff\">rCat x , n(1000) p(PR)<\/span><\/strong>\r\n<strong><span style=\"color: 
#0000ff\">tabu x<\/span><\/strong><\/pre>\n<p>The Stata program rCat calls rCategorical() to generate n values from a single categorical distribution with probabilities p (which must sum to one, although this is not checked). Inside the Stata program the probabilities are transferred to Mata using the st_matrix() function; rCategorical() is then called repeatedly and each result is passed back to Stata in the scalar r(h). After the Stata function has been called, the results are tabulated so that we can see\u00a0that the proportions agree with the probabilities supplied to the function. This approach is not as efficient as creating a calling function in Mata, but it is simple to write and simple to use.<\/p>\n<p><em><strong>Downloading the code<\/strong><\/em><\/p>\n<p>The do file for creating this library can be downloaded from <a href=\"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/files\/2014\/04\/makeLibbayesVersion1.pdf\">makeLibbayesVersion1<\/a>. The software for this\u00a0blog will not allow me to upload a do file directly, so the link is to a pdf; however, it is simple to copy and paste the code from the pdf into the do file editor. 
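The same cumulative-sum algorithm that rCategorical() uses, and the frequency check that rCat performs, can also be sketched outside Stata. Here is a Python analogue for illustration (the name r_categorical is mine; this is not the library code itself), drawing by inverse CDF and comparing the empirical proportions with the supplied probabilities:

```python
import random

def r_categorical(P):
    # Inverse-CDF draw: walk the cumulative sum until it passes a uniform draw.
    u = random.random()
    cum = 0.0
    for h, p in enumerate(P, start=1):
        cum += p
        if u <= cum:
            return h
    return len(P)  # guard against rounding error in sum(P)

random.seed(2014)
P = [0.2, 0.4, 0.1, 0.3]
draws = [r_categorical(P) for _ in range(100000)]
freq = [draws.count(h) / len(draws) for h in range(1, len(P) + 1)]
# each entry of freq should lie close to the corresponding entry of P
```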
The test programs can be downloaded from <a href=\"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/files\/2014\/05\/testprograms.pdf\">testprograms<\/a>.<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>MCMC algorithms can be slow, so it is often necessary to pay particular attention to the efficiency of\u00a0one&#8217;s code and usually this means programming in Mata.\u00a0For this reason,\u00a0the slice, griddy, ARS and ARMS samplers that are described in \u2018Bayesian Analysis with Stata\u2019 were programmed using Mata even though they can be called from either Mata [&hellip;]<\/p>\n","protected":false},"author":134,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[15],"class_list":["post-106","post","type-post","status-publish","format-standard","hentry","category-uncategorized","tag-mata-random-numbers-library"],"_links":{"self":[{"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/posts\/106","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/users\/134"}],"replies":[{"embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/comments?post=106"}],"version-history":[{"count":9,"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/posts\/106\/revisions"}],"predecessor-version":[{"id":119,"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/posts\/106\/revisions\/119"}],"wp:attachment":[{"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/media?parent=106"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/bay
eswithstata\/wp-json\/wp\/v2\/categories?post=106"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/staffblogs.le.ac.uk\/bayeswithstata\/wp-json\/wp\/v2\/tags?post=106"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}