{"id":3602,"date":"2023-06-07T07:44:00","date_gmt":"2023-06-07T07:44:00","guid":{"rendered":"https:\/\/mpelembe.net\/?p=3602"},"modified":"2023-06-07T08:22:06","modified_gmt":"2023-06-07T08:22:06","slug":"what-is-chain-of-thought-prompting","status":"publish","type":"post","link":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/","title":{"rendered":"What is Chain-of-Thought Prompting?"},"content":{"rendered":"<div class=\"chat-history-scroll-container ng-tns-c3030796456-1 show-bottom-shadow ng-trigger ng-trigger-resetChat\">\n<div class=\"chat-history ng-tns-c3030796456-1 ng-star-inserted\" data-test-id=\"chat-history-container\">\n<div class=\"conversation-container ng-tns-c3030796456-1 ng-star-inserted\">\n<div class=\"ng-tns-c4020060676-21 ng-trigger ng-trigger-responsePopulation ng-animate-disabled\">\n<div class=\"response-container response-container-has-multiple-responses ng-star-inserted response-container-small-bottom\">\n<div class=\"presented-response-container\">\n<div class=\"response-container-content\">\n<div class=\"markdown\" dir=\"ltr\">\n<p>June 7, 2023 \/Technology\/ &#8212; Chain-of-thought (CoT) prompting is a method for improving the performance of large language models (LLMs) on reasoning tasks. CoT prompts encourage LLMs to explain their reasoning process by providing them with a few examples of how to do so.<\/p>\n<p>CoT prompting was first introduced in the paper &#8220;Chain-of-Thought Prompting Elicits Reasoning in Large Language Models&#8221; by Wei et al. (2022). In this paper, the authors showed that CoT prompting can significantly improve the performance of LLMs on a range of reasoning tasks, including arithmetic, commonsense, and symbolic reasoning.<!--more--><\/p>\n<p>The CoT prompting method works by first providing the LLM with a few examples of how to reason about a particular problem. 
For example, if the LLM is asked to solve the following arithmetic problem:<\/p>\n<div class=\"code-block\">\n<div class=\"code-block-decoration header gmat-subhead-2 ng-star-inserted\">Code snippet<\/div>\n<pre><code class=\"code-container\" role=\"text\">What is 5 + 7?\r\n<\/code><\/pre>\n<div class=\"code-block-decoration footer gmat-overline hide-from-message-actions ng-star-inserted\">the CoT prompt might provide the following examples:<\/div>\n<\/div>\n<div class=\"code-block\">\n<div class=\"code-block-decoration header gmat-subhead-2 ng-star-inserted\">Code snippet<\/div>\n<pre><code class=\"code-container\" role=\"text\">1. We can add 5 and 7 to get 12.\r\n2. We can think of 5 + 7 as 5 + 5 + 2 = 10 + 2 = 12.\r\n3. We can use a calculator to add 5 and 7 to get 12.\r\n<\/code><\/pre>\n<div class=\"code-block-decoration footer gmat-overline hide-from-message-actions ng-star-inserted\">\n<p>Here is another example of how CoT prompting can be used to improve the performance of an LLM on a math word problem:<\/p>\n<div class=\"code-block\">\n<div class=\"code-block-decoration header gmat-subhead-2 ng-star-inserted\">Code snippet<\/div>\n<pre><code class=\"code-container\" role=\"text\">Prompt: What is the sum of 1, 2, 3, 4, and 5?\r\n\r\nCoT: Here is one way to solve this problem, keeping a running total:\r\n\r\n1. Add 1 and 2 to get 3.\r\n2. Add 3 and 3 to get 6.\r\n3. Add 6 and 4 to get 10.\r\n4. Add 10 and 5 to get 15.\r\n5. The answer is 15.\r\n\r\nNow, try to solve this problem on your own:\r\n\r\nWhat is the sum of 10, 11, 12, 13, and 14?\r\n<\/code><\/pre>\n<div class=\"code-block-decoration footer gmat-overline hide-from-message-actions ng-star-inserted\">By providing the LLM with a few examples of how to solve math word problems, CoT prompting can help the LLM learn to solve these problems on its own.<\/div>\n<\/div>\n<p>Once the LLM has been provided with these examples, it is then asked to solve the problem on its own. 
The CoT prompt encourages the LLM to use the same reasoning process that was used in the examples.<\/p><\/div>\n<\/div>\n<p>CoT prompting has been shown to be an effective way to improve the performance of LLMs on reasoning tasks. However, it is not a silver bullet: it can only improve performance on tasks that are amenable to step-by-step reasoning. For example, CoT prompting is unlikely to help with tasks that have no intermediate reasoning steps, such as simple factual recall or open-ended creative writing.<\/p>\n<p>CoT prompting can also improve the performance of LLMs on other types of reasoning tasks, such as commonsense and symbolic reasoning. For example, it can help LLMs learn to make inferences, draw conclusions, and solve problems that require multiple steps of reasoning.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"input-area-container ng-tns-c3030796456-1\">\n<div class=\"input-area\">\n<div class=\"mat-mdc-text-field-wrapper mdc-text-field ng-tns-c2119496066-2 mdc-text-field--outlined mdc-text-field--no-label\">\n<div class=\"mat-mdc-form-field-flex ng-tns-c2119496066-2\">\n<div class=\"mdc-notched-outline 
ng-tns-c2119496066-2 mdc-notched-outline--no-label ng-star-inserted\"><\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>June 7, 2023 \/Technology\/ &#8212; Chain-of-thought (CoT) prompting is a method for improving the performance of large language models (LLMs) on reasoning tasks. CoT<a class=\"moretag\" href=\"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/\">Read More&#8230;<\/a><\/p>\n","protected":false},"author":1,"featured_media":3603,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"googlesitekit_rrm_CAowu7GVCw:productID":"","_crdt_document":"","activitypub_content_warning":"","activitypub_content_visibility":"","activitypub_max_image_attachments":3,"activitypub_interaction_policy_quote":"anyone","activitypub_status":"","footnotes":""},"categories":[5823],"tags":[50,52,2223,53,54,2222,6570,8110],"class_list":["post-3602","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-developers","tag-articles","tag-artificial-intelligence","tag-branches-of-philosophy","tag-computational-neuroscience","tag-cybernetics","tag-philosophy","tag-reason","tag-wei"],"featured_image_src":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting-1024x767.png","blog_images":{"medium":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting-300x225.png","large":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting-1024x767.png"},"ams_acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>What is Chain-of-Thought Prompting? 
- Mpelembe Network<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"What is Chain-of-Thought Prompting? - Mpelembe Network\" \/>\n<meta property=\"og:description\" content=\"June 7, 2023 \/Technology\/ &#8212; Chain-of-thought (CoT) prompting is a method for improving the performance of large language models (LLMs) on reasoning tasks. CoTRead More...\" \/>\n<meta property=\"og:url\" content=\"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/\" \/>\n<meta property=\"og:site_name\" content=\"Mpelembe Network\" \/>\n<meta property=\"article:published_time\" content=\"2023-06-07T07:44:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-06-07T08:22:06+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1180\" \/>\n\t<meta property=\"og:image:height\" content=\"884\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"admin\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"admin\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"2 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/\"},\"author\":{\"name\":\"admin\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/#\\\/schema\\\/person\\\/2421ebbf3150931b1066b10a196d7608\"},\"headline\":\"What is Chain-of-Thought Prompting?\",\"datePublished\":\"2023-06-07T07:44:00+00:00\",\"dateModified\":\"2023-06-07T08:22:06+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/\"},\"wordCount\":359,\"image\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/mpelembe.net\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Chain-of-Thought-Prompting.png\",\"keywords\":[\"Articles\",\"Artificial intelligence\",\"Branches of philosophy\",\"Computational neuroscience\",\"Cybernetics\",\"Philosophy\",\"Reason\",\"Wei\"],\"articleSection\":[\"Developers\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/\",\"url\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/\",\"name\":\"What is Chain-of-Thought Prompting? 
- Mpelembe Network\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/mpelembe.net\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Chain-of-Thought-Prompting.png\",\"datePublished\":\"2023-06-07T07:44:00+00:00\",\"dateModified\":\"2023-06-07T08:22:06+00:00\",\"author\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/#\\\/schema\\\/person\\\/2421ebbf3150931b1066b10a196d7608\"},\"breadcrumb\":{\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#primaryimage\",\"url\":\"https:\\\/\\\/mpelembe.net\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Chain-of-Thought-Prompting.png\",\"contentUrl\":\"https:\\\/\\\/mpelembe.net\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Chain-of-Thought-Prompting.png\",\"width\":1180,\"height\":884},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/what-is-chain-of-thought-prompting\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/mpelembe.net\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"What is Chain-of-Thought Prompting?\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/#website\",\"url\":\"https:\\\/\\\/mpelembe.net\\\/\",\"name\":\"Mpelembe Network\",\"description\":\"Collaboration 
Platform\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/mpelembe.net\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/mpelembe.net\\\/#\\\/schema\\\/person\\\/2421ebbf3150931b1066b10a196d7608\",\"name\":\"admin\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/c66a2765397adfb52418f6f2310640167a0af23ce662da1b68c8a0b8650de556?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/c66a2765397adfb52418f6f2310640167a0af23ce662da1b68c8a0b8650de556?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/c66a2765397adfb52418f6f2310640167a0af23ce662da1b68c8a0b8650de556?s=96&d=mm&r=g\",\"caption\":\"admin\"},\"sameAs\":[\"https:\\\/\\\/mpelembe.net\"],\"url\":\"https:\\\/\\\/mpelembe.net\\\/index.php\\\/author\\\/admin\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"What is Chain-of-Thought Prompting? - Mpelembe Network","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/","og_locale":"en_US","og_type":"article","og_title":"What is Chain-of-Thought Prompting? - Mpelembe Network","og_description":"June 7, 2023 \/Technology\/ &#8212; Chain-of-thought (CoT) prompting is a method for improving the performance of large language models (LLMs) on reasoning tasks. 
CoTRead More...","og_url":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/","og_site_name":"Mpelembe Network","article_published_time":"2023-06-07T07:44:00+00:00","article_modified_time":"2023-06-07T08:22:06+00:00","og_image":[{"width":1180,"height":884,"url":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting.png","type":"image\/png"}],"author":"admin","twitter_card":"summary_large_image","twitter_misc":{"Written by":"admin","Est. reading time":"2 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#article","isPartOf":{"@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/"},"author":{"name":"admin","@id":"https:\/\/mpelembe.net\/#\/schema\/person\/2421ebbf3150931b1066b10a196d7608"},"headline":"What is Chain-of-Thought Prompting?","datePublished":"2023-06-07T07:44:00+00:00","dateModified":"2023-06-07T08:22:06+00:00","mainEntityOfPage":{"@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/"},"wordCount":359,"image":{"@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#primaryimage"},"thumbnailUrl":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting.png","keywords":["Articles","Artificial intelligence","Branches of philosophy","Computational neuroscience","Cybernetics","Philosophy","Reason","Wei"],"articleSection":["Developers"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/","url":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/","name":"What is Chain-of-Thought Prompting? 
- Mpelembe Network","isPartOf":{"@id":"https:\/\/mpelembe.net\/#website"},"primaryImageOfPage":{"@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#primaryimage"},"image":{"@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#primaryimage"},"thumbnailUrl":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting.png","datePublished":"2023-06-07T07:44:00+00:00","dateModified":"2023-06-07T08:22:06+00:00","author":{"@id":"https:\/\/mpelembe.net\/#\/schema\/person\/2421ebbf3150931b1066b10a196d7608"},"breadcrumb":{"@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#primaryimage","url":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting.png","contentUrl":"https:\/\/mpelembe.net\/wp-content\/uploads\/2023\/06\/Chain-of-Thought-Prompting.png","width":1180,"height":884},{"@type":"BreadcrumbList","@id":"https:\/\/mpelembe.net\/index.php\/what-is-chain-of-thought-prompting\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/mpelembe.net\/"},{"@type":"ListItem","position":2,"name":"What is Chain-of-Thought Prompting?"}]},{"@type":"WebSite","@id":"https:\/\/mpelembe.net\/#website","url":"https:\/\/mpelembe.net\/","name":"Mpelembe Network","description":"Collaboration 
Platform","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/mpelembe.net\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/mpelembe.net\/#\/schema\/person\/2421ebbf3150931b1066b10a196d7608","name":"admin","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/c66a2765397adfb52418f6f2310640167a0af23ce662da1b68c8a0b8650de556?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/c66a2765397adfb52418f6f2310640167a0af23ce662da1b68c8a0b8650de556?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/c66a2765397adfb52418f6f2310640167a0af23ce662da1b68c8a0b8650de556?s=96&d=mm&r=g","caption":"admin"},"sameAs":["https:\/\/mpelembe.net"],"url":"https:\/\/mpelembe.net\/index.php\/author\/admin\/"}]}},"_links":{"self":[{"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/posts\/3602","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/comments?post=3602"}],"version-history":[{"count":3,"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/posts\/3602\/revisions"}],"predecessor-version":[{"id":3609,"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/posts\/3602\/revisions\/3609"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/media\/3603"}],"wp:attachment":[{"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/media?parent=3602"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mpelembe.net\/i
ndex.php\/wp-json\/wp\/v2\/categories?post=3602"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mpelembe.net\/index.php\/wp-json\/wp\/v2\/tags?post=3602"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}