How to improve SEO for Serverless Websites?




















I want to improve SEO (i.e., have my pages correctly indexed by search engines) in a serverless architecture where my website is hosted on AWS S3.



As I'm using a JavaScript approach to routing (something akin to Angular, but simpler) and fetching dynamic content to fill my meta tags, everything becomes quite troublesome for scrapers without JavaScript support, like Facebook's.



I have default meta tags already in place, and those are, of course, loaded just fine, but I need the updated ones.



I know most people use pre-rendering on a server or through something like Prerender.io, but I really wanted to find an alternative that makes sense in a serverless architecture.



I thought I had it figured out, since Open Graph meta tags allow for a "pointer" URL where you can host a "meta-tags-only" HTML page if needed. So I was thinking of using a Lambda function to generate an HTML response with the right meta tags on a GET request. The problem is: since the Facebook scraper has no JavaScript support, how can I serve the dynamic content in that GET response?
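For context, this is roughly the "meta-tags-only" response I imagined the Lambda generating (the field names here are just placeholders, not my real schema):

```javascript
// Sketch: build a minimal HTML page carrying only the Open Graph meta tags.
// pageData stands in for whatever the Lambda would fetch for the requested
// route; its fields (title, description, etc.) are placeholders.
function buildMetaTagsHtml(pageData) {
  return [
    '<!DOCTYPE html>',
    '<html><head>',
    '<meta property="og:title" content="' + pageData.title + '" />',
    '<meta property="og:description" content="' + pageData.description + '" />',
    '<meta property="og:image" content="' + pageData.imageUrl + '" />',
    '<meta property="og:url" content="' + pageData.canonicalUrl + '" />',
    '</head><body></body></html>'
  ].join('\n');
}
```

The open question is how to get this dynamic HTML served to a scraper that only performs a plain GET.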









































Tags: javascript, facebook, amazon-s3, seo, serverless-framework














Asked Nov 20 '16 at 22:11 by João Moreira · Edited Nov 21 '16 at 1:56 by Zanon
























3 Answers
































If you are using S3 alone, you must prerender the pages before uploading them. You can't call Lambda functions on the fly, because the crawler will not execute JavaScript. You can't even use Prerender.io with S3.



Suggestion:


1. Host your website locally.

2. Use PhantomJS to fetch the pages and write a prerendered version of each one.

3. Upload each page to S3 following the page address*.


* E.g.: the address example.com/about/us must be mapped to a us.html file inside an about folder in your bucket root.
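The address-to-key mapping from step 3 can be sketched as a small helper (a minimal sketch of one reasonable convention):

```javascript
// Sketch: map a page URL path to the S3 object key of its prerendered HTML.
// E.g. "/about/us" -> "about/us.html", and the root "/" -> "index.html".
function pathToS3Key(urlPath) {
  // Drop leading and trailing slashes before appending the extension.
  var trimmed = urlPath.replace(/^\/+|\/+$/g, '');
  return trimmed === '' ? 'index.html' : trimmed + '.html';
}
```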



Now, your users and the crawlers will see exactly the same pages, without needing JavaScript to load the initial state. The difference is that, with JavaScript enabled, your framework (Angular?) will load the JS dependencies (routes, services, etc.) and take control like a normal SPA. When the user clicks to browse another page, the SPA will load the inner content without making a full page reload.



          Pros:




• Easy to set up.

          • Very fast to serve content. You can also use CloudFront to improve the speed.


          Cons:




• If you have 1,000 pages (e.g., 1,000 products that you sell in your store), you need to make 1,000 prerendered pages.

          • If your page data changes frequently, you need to prerender frequently.

          • Sometimes the crawler will index old content*.


          * The crawler will see the old content, but the user will probably see the current content as the SPA framework will take control of the page and load the inner content again.





You said that you are using S3. If you want to prerender on the fly, you can't use S3 as the origin. You need the following:



          Route 53 => CloudFront => API Gateway => Lambda



Configure:

- Set the API Gateway endpoint as the CloudFront origin.

- Set the "Origin Protocol Policy" to "HTTPS Only" (CloudFront).

- The Lambda function must use a proxy integration.



          In this case, your Lambda function will know the requested address and will be able to correctly render the requested HTML page.



          Pros:




          • As Lambda has access to the database, the rendered page will always be updated.


          Cons:




• Pages load much more slowly.






answered Nov 21 '16 at 1:46 by Zanon



















– João Moreira (Nov 21 '16 at 10:00): Wow, very detailed answer! Thank you very much! If no other answer comes, I'll definitely "give up" and take your suggestion. Thank you so much for your time.

































If you are willing to use CloudFront on top of your S3 bucket, there is a new possibility for solving your problem with prerendering on the fly. Lambda@Edge is a new feature that allows code to be executed with low latency when a page is requested. With it, you can check whether the agent is a crawler and prerender the page for it.



          01 Dec 2016 announcement: Lambda@Edge – Preview




          Just last week, a comment that I made on Hacker News resulted in an
          interesting email from an AWS customer!




          (...)




          Here’s how he explained his problem to me:




          In order to properly get indexed by search engines and in order for
          previews of our content to show up correctly within Facebook and
          Twitter, we need to serve a prerendered version of each of our pages.
          In order to do this, every time a normal user hits our site need for
          them to be served our normal front end from Cloudfront. But if the
          user agent matches Google / Facebook / Twitter etc., we need to
          instead redirect them the prerendered version of the site.




          Without spilling any beans I let him know that we were very aware of
          this use case and that we had some interesting solutions in the works.
          Other customers have also let us know that they want to customize
          their end user experience by making quick decisions out at the edge.




This feature is currently in preview (Dec 2016), but you can ask AWS for access to experiment with it.






answered Dec 4 '16 at 2:41 by Zanon













































            Here's a solution that uses (and is approved by) prerender.cloud: https://github.com/sanfrancesco/prerendercloud-lambda-edge



            This uses Lambda@Edge to prerender your app via a make deploy command.



            Taken from the repo's README:




            Server-side rendering (pre-rendering) via Lambda@Edge for single-page apps hosted on CloudFront with an s3 origin.



            This is a serverless project with a make deploy command that:




            1. serverless.yml deploys 3 functions to Lambda (viewerRequest, originRequest, originResponse)

            2. deploy.js associates them with your CloudFront distribution

            3. create-invalidation.js clears/invalidates your CloudFront cache







            share|improve this answer


























              Your Answer






              StackExchange.ifUsing("editor", function () {
              StackExchange.using("externalEditor", function () {
              StackExchange.using("snippets", function () {
              StackExchange.snippets.init();
              });
              });
              }, "code-snippets");

              StackExchange.ready(function() {
              var channelOptions = {
              tags: "".split(" "),
              id: "1"
              };
              initTagRenderer("".split(" "), "".split(" "), channelOptions);

              StackExchange.using("externalEditor", function() {
              // Have to fire editor after snippets, if snippets enabled
              if (StackExchange.settings.snippets.snippetsEnabled) {
              StackExchange.using("snippets", function() {
              createEditor();
              });
              }
              else {
              createEditor();
              }
              });

              function createEditor() {
              StackExchange.prepareEditor({
              heartbeatType: 'answer',
              autoActivateHeartbeat: false,
              convertImagesToLinks: true,
              noModals: true,
              showLowRepImageUploadWarning: true,
              reputationToPostImages: 10,
              bindNavPrevention: true,
              postfix: "",
              imageUploader: {
              brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
              contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
              allowUrls: true
              },
              onDemand: true,
              discardSelector: ".discard-answer"
              ,immediatelyShowMarkdownHelp:true
              });


              }
              });














              draft saved

              draft discarded


















              StackExchange.ready(
              function () {
              StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f40709990%2fhow-to-improve-seo-for-serverless-websites%23new-answer', 'question_page');
              }
              );

              Post as a guest















              Required, but never shown

























              3 Answers
              3






              active

              oldest

              votes








              3 Answers
              3






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes









              5














              If you are using S3, you must prerender the pages before uploading them. You can't call Lambda functions on the fly because the crawler will not execute JavaScript. You can't even use Prerender.io with S3.



              Suggestion:




              1. Host your website locally.

              2. Use PhanthomJS to fetch the pages and write a prerendered version.

              3. Upload each page to S3 following the page address*.


              * E.g.: the address from example.com/about/us must be mapped as a us.html file inside a folder about in your bucket root.



              Now, your users and the crawlers will see the exactly the same pages, without needing JavaScript to load the initial state. The difference is that with JavaScript enabled, your framework (Angular?) will load the JS dependencies (like routes, services, etc.) and take control like a normal SPA application. When the user click to browse another page, the SPA will reload the inner content without making a full page reload.



              Pros:




              • Easy to setup.

              • Very fast to serve content. You can also use CloudFront to improve the speed.


              Cons:




              • If you have 1000 pages (for e.g.: 1000 products that you sell in your store), you need make 1000 prerendered pages.

              • If your page data changes frequently, you need to prerender frequently.

              • Sometimes the crawler will index old content*.


              * The crawler will see the old content, but the user will probably see the current content as the SPA framework will take control of the page and load the inner content again.





              You said that you are using S3. If you want to prerender on the fly, you can't use S3. You need to use the following:



              Route 53 => CloudFront => API Gateway => Lambda



              Configure:

              - Set the API Gateway endpoint as the CloudFront origin.

              - Use "HTTPS Only" in the "Origin Policy Protocol" (CloudFront).

              - The Lambda function must be a proxy.



              In this case, your Lambda function will know the requested address and will be able to correctly render the requested HTML page.



              Pros:




              • As Lambda has access to the database, the rendered page will always be updated.


              Cons:




              • Much slower to load the webpages.






              share|improve this answer



















              • 1





                Wow very detailed answer! Thank you very much! If none other comes I'll definitely "give up" and take your suggestion. Thank you so much for your time.

                – João Moreira
                Nov 21 '16 at 10:00
















              5














              If you are using S3, you must prerender the pages before uploading them. You can't call Lambda functions on the fly because the crawler will not execute JavaScript. You can't even use Prerender.io with S3.



              Suggestion:




              1. Host your website locally.

              2. Use PhanthomJS to fetch the pages and write a prerendered version.

              3. Upload each page to S3 following the page address*.


              * E.g.: the address from example.com/about/us must be mapped as a us.html file inside a folder about in your bucket root.



              Now, your users and the crawlers will see the exactly the same pages, without needing JavaScript to load the initial state. The difference is that with JavaScript enabled, your framework (Angular?) will load the JS dependencies (like routes, services, etc.) and take control like a normal SPA application. When the user click to browse another page, the SPA will reload the inner content without making a full page reload.



              Pros:




              • Easy to setup.

              • Very fast to serve content. You can also use CloudFront to improve the speed.


              Cons:




              • If you have 1000 pages (for e.g.: 1000 products that you sell in your store), you need make 1000 prerendered pages.

              • If your page data changes frequently, you need to prerender frequently.

              • Sometimes the crawler will index old content*.


              * The crawler will see the old content, but the user will probably see the current content as the SPA framework will take control of the page and load the inner content again.





              You said that you are using S3. If you want to prerender on the fly, you can't use S3. You need to use the following:



              Route 53 => CloudFront => API Gateway => Lambda



              Configure:

              - Set the API Gateway endpoint as the CloudFront origin.

              - Use "HTTPS Only" in the "Origin Policy Protocol" (CloudFront).

              - The Lambda function must be a proxy.



              In this case, your Lambda function will know the requested address and will be able to correctly render the requested HTML page.



              Pros:




              • As Lambda has access to the database, the rendered page will always be updated.


              Cons:




              • Much slower to load the webpages.






              share|improve this answer



















              • 1





                Wow very detailed answer! Thank you very much! If none other comes I'll definitely "give up" and take your suggestion. Thank you so much for your time.

                – João Moreira
                Nov 21 '16 at 10:00














              5












              5








              5







              If you are using S3, you must prerender the pages before uploading them. You can't call Lambda functions on the fly because the crawler will not execute JavaScript. You can't even use Prerender.io with S3.



              Suggestion:




              1. Host your website locally.

              2. Use PhanthomJS to fetch the pages and write a prerendered version.

              3. Upload each page to S3 following the page address*.


              * E.g.: the address from example.com/about/us must be mapped as a us.html file inside a folder about in your bucket root.



              Now, your users and the crawlers will see the exactly the same pages, without needing JavaScript to load the initial state. The difference is that with JavaScript enabled, your framework (Angular?) will load the JS dependencies (like routes, services, etc.) and take control like a normal SPA application. When the user click to browse another page, the SPA will reload the inner content without making a full page reload.



              Pros:




              • Easy to setup.

              • Very fast to serve content. You can also use CloudFront to improve the speed.


              Cons:




              • If you have 1000 pages (for e.g.: 1000 products that you sell in your store), you need make 1000 prerendered pages.

              • If your page data changes frequently, you need to prerender frequently.

              • Sometimes the crawler will index old content*.


              * The crawler will see the old content, but the user will probably see the current content as the SPA framework will take control of the page and load the inner content again.





              You said that you are using S3. If you want to prerender on the fly, you can't use S3. You need to use the following:



              Route 53 => CloudFront => API Gateway => Lambda



              Configure:

              - Set the API Gateway endpoint as the CloudFront origin.

              - Use "HTTPS Only" in the "Origin Policy Protocol" (CloudFront).

              - The Lambda function must be a proxy.



              In this case, your Lambda function will know the requested address and will be able to correctly render the requested HTML page.



              Pros:




              • As Lambda has access to the database, the rendered page will always be updated.


              Cons:




              • Much slower to load the webpages.






              share|improve this answer













              If you are using S3, you must prerender the pages before uploading them. You can't call Lambda functions on the fly because the crawler will not execute JavaScript. You can't even use Prerender.io with S3.



              Suggestion:




              1. Host your website locally.

              2. Use PhanthomJS to fetch the pages and write a prerendered version.

              3. Upload each page to S3 following the page address*.


              * E.g.: the address from example.com/about/us must be mapped as a us.html file inside a folder about in your bucket root.



              Now, your users and the crawlers will see the exactly the same pages, without needing JavaScript to load the initial state. The difference is that with JavaScript enabled, your framework (Angular?) will load the JS dependencies (like routes, services, etc.) and take control like a normal SPA application. When the user click to browse another page, the SPA will reload the inner content without making a full page reload.



              Pros:




              • Easy to setup.

              • Very fast to serve content. You can also use CloudFront to improve the speed.


              Cons:




              • If you have 1000 pages (for e.g.: 1000 products that you sell in your store), you need make 1000 prerendered pages.

              • If your page data changes frequently, you need to prerender frequently.

              • Sometimes the crawler will index old content*.


              * The crawler will see the old content, but the user will probably see the current content as the SPA framework will take control of the page and load the inner content again.





              You said that you are using S3. If you want to prerender on the fly, you can't use S3. You need to use the following:



              Route 53 => CloudFront => API Gateway => Lambda



              Configure:

              - Set the API Gateway endpoint as the CloudFront origin.

              - Use "HTTPS Only" in the "Origin Policy Protocol" (CloudFront).

              - The Lambda function must be a proxy.



              In this case, your Lambda function will know the requested address and will be able to correctly render the requested HTML page.



              Pros:




              • As Lambda has access to the database, the rendered page will always be updated.


              Cons:




              • Much slower to load the webpages.







              share|improve this answer












              share|improve this answer



              share|improve this answer










              answered Nov 21 '16 at 1:46









              ZanonZanon

              14.8k127896




              14.8k127896








              • 1





                Wow very detailed answer! Thank you very much! If none other comes I'll definitely "give up" and take your suggestion. Thank you so much for your time.

                – João Moreira
                Nov 21 '16 at 10:00














              • 1





                Wow very detailed answer! Thank you very much! If none other comes I'll definitely "give up" and take your suggestion. Thank you so much for your time.

                – João Moreira
                Nov 21 '16 at 10:00








              1




              1





              Wow very detailed answer! Thank you very much! If none other comes I'll definitely "give up" and take your suggestion. Thank you so much for your time.

              – João Moreira
              Nov 21 '16 at 10:00





              Wow very detailed answer! Thank you very much! If none other comes I'll definitely "give up" and take your suggestion. Thank you so much for your time.

              – João Moreira
              Nov 21 '16 at 10:00













              6














              If you are willing to use CloudFront on top of your S3 bucket, there is a new possibility to solve your problem using prerender on the fly. Lambda@Edge is a new feature that allows code to be executed with low latency when a page is requested. With this, you can verify if the agent is a crawler and prerender the page for him.



              01 Dec 2016 announcement: Lambda@Edge – Preview




              Just last week, a comment that I made on Hacker News resulted in an
              interesting email from an AWS customer!




              (...)




              Here’s how he explained his problem to me:




              In order to properly get indexed by search engines and in order for
              previews of our content to show up correctly within Facebook and
              Twitter, we need to serve a prerendered version of each of our pages.
              In order to do this, every time a normal user hits our site need for
              them to be served our normal front end from Cloudfront. But if the
              user agent matches Google / Facebook / Twitter etc., we need to
              instead redirect them the prerendered version of the site.




              Without spilling any beans I let him know that we were very aware of
              this use case and that we had some interesting solutions in the works.
              Other customers have also let us know that they want to customize
              their end user experience by making quick decisions out at the edge.




              This feature is currently in preview mode (dec/2016), but you can request AWS to experiement it.






              share|improve this answer




























                6














                If you are willing to use CloudFront on top of your S3 bucket, there is a new possibility to solve your problem using prerender on the fly. Lambda@Edge is a new feature that allows code to be executed with low latency when a page is requested. With this, you can verify if the agent is a crawler and prerender the page for him.



                01 Dec 2016 announcement: Lambda@Edge – Preview




                Just last week, a comment that I made on Hacker News resulted in an
                interesting email from an AWS customer!




                (...)




                Here’s how he explained his problem to me:




                In order to properly get indexed by search engines and in order for
                previews of our content to show up correctly within Facebook and
                Twitter, we need to serve a prerendered version of each of our pages.
                In order to do this, every time a normal user hits our site need for
                them to be served our normal front end from Cloudfront. But if the
                user agent matches Google / Facebook / Twitter etc., we need to
                instead redirect them the prerendered version of the site.




                Without spilling any beans I let him know that we were very aware of
                this use case and that we had some interesting solutions in the works.
                Other customers have also let us know that they want to customize
                their end user experience by making quick decisions out at the edge.




                This feature is currently in preview mode (dec/2016), but you can request AWS to experiement it.






                share|improve this answer


























                  6












                  6








                  6







                  If you are willing to use CloudFront on top of your S3 bucket, there is a new possibility to solve your problem using prerender on the fly. Lambda@Edge is a new feature that allows code to be executed with low latency when a page is requested. With this, you can verify if the agent is a crawler and prerender the page for him.



                  01 Dec 2016 announcement: Lambda@Edge – Preview




                  Just last week, a comment that I made on Hacker News resulted in an
                  interesting email from an AWS customer!




                  (...)




                  Here’s how he explained his problem to me:




                  In order to properly get indexed by search engines and in order for
                  previews of our content to show up correctly within Facebook and
                  Twitter, we need to serve a prerendered version of each of our pages.
                  In order to do this, every time a normal user hits our site need for
                  them to be served our normal front end from Cloudfront. But if the
                  user agent matches Google / Facebook / Twitter etc., we need to
                  instead redirect them the prerendered version of the site.




                  Without spilling any beans I let him know that we were very aware of
                  this use case and that we had some interesting solutions in the works.
                  Other customers have also let us know that they want to customize
                  their end user experience by making quick decisions out at the edge.




                  This feature is currently in preview mode (dec/2016), but you can request AWS to experiement it.






                  share|improve this answer













                  If you are willing to use CloudFront on top of your S3 bucket, there is a new possibility to solve your problem using prerender on the fly. Lambda@Edge is a new feature that allows code to be executed with low latency when a page is requested. With this, you can verify if the agent is a crawler and prerender the page for him.



                  01 Dec 2016 announcement: Lambda@Edge – Preview




                  Just last week, a comment that I made on Hacker News resulted in an
                  interesting email from an AWS customer!




                  (...)




                  Here’s how he explained his problem to me:




                  In order to properly get indexed by search engines, and in order for
                  previews of our content to show up correctly within Facebook and
                  Twitter, we need to serve a prerendered version of each of our pages.
                  In order to do this, every time a normal user hits our site we need
                  for them to be served our normal front end from CloudFront. But if the
                  user agent matches Google / Facebook / Twitter etc., we need to
                  instead redirect them to the prerendered version of the site.




                  Without spilling any beans I let him know that we were very aware of
                  this use case and that we had some interesting solutions in the works.
                  Other customers have also let us know that they want to customize
                  their end user experience by making quick decisions out at the edge.




                  This feature is currently in preview mode (Dec/2016), but you can request access from AWS to experiment with it.
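                  Conceptually, the crawler check described above could be sketched as a Lambda@Edge viewer-request handler like the one below. This is a minimal illustration, not a production bot detector: the user-agent pattern and the `/prerendered` S3 key prefix are assumptions for the example.

```javascript
'use strict';

// Hypothetical Lambda@Edge viewer-request handler: route known crawlers
// to a prerendered copy of the page. The bot list and the "/prerendered"
// prefix in the same S3 origin are illustrative assumptions.
const BOT_PATTERN = /googlebot|bingbot|facebookexternalhit|twitterbot|linkedinbot/i;

function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

const handler = (event, context, callback) => {
  const request = event.Records[0].cf.request;
  const uaHeader = request.headers['user-agent'];
  const userAgent = uaHeader && uaHeader[0] ? uaHeader[0].value : '';

  if (isCrawler(userAgent)) {
    // Rewrite the URI so CloudFront fetches the prerendered HTML
    // instead of the JavaScript app shell.
    request.uri = '/prerendered' + request.uri;
  }

  // Returning the (possibly rewritten) request lets CloudFront continue.
  callback(null, request);
};

exports.handler = handler;
```

                  Normal users still get the regular single-page app from CloudFront; only requests whose user agent matches the pattern are served the prerendered HTML, which is what the Facebook and Twitter scrapers need.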







                  answered Dec 4 '16 at 2:41









                  Zanon























                      Here's a solution that uses (and is approved by) prerender.cloud: https://github.com/sanfrancesco/prerendercloud-lambda-edge



                      This uses Lambda@Edge to prerender your app via a make deploy command.



                      Taken from the repo's README:




                      Server-side rendering (pre-rendering) via Lambda@Edge for single-page apps hosted on CloudFront with an s3 origin.



                      This is a serverless project with a make deploy command that:




                      1. serverless.yml deploys 3 functions to Lambda (viewerRequest, originRequest, originResponse)

                      2. deploy.js associates them with your CloudFront distribution

                      3. create-invalidation.js clears/invalidates your CloudFront cache
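                      To give a feel for step 1, a serverless.yml for three Lambda@Edge functions might be shaped roughly like the fragment below. This is a hypothetical sketch, not the repo's actual file; the service name and handler paths are assumptions, and note that Lambda@Edge functions must be deployed in us-east-1.

```yaml
# Hypothetical serverless.yml fragment (not the repo's actual config).
service: prerender-edge

provider:
  name: aws
  runtime: nodejs12.x
  region: us-east-1   # Lambda@Edge requires us-east-1

functions:
  viewerRequest:
    handler: handlers.viewerRequest    # inspect user agent
  originRequest:
    handler: handlers.originRequest    # decide whether to prerender
  originResponse:
    handler: handlers.originResponse   # cache/return the rendered HTML
```

                      Steps 2 and 3 then wire these function versions to the CloudFront distribution's behaviors and invalidate the cache so the new behavior takes effect.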
























































































                          edited Nov 22 '18 at 11:40









                          ayaio











                          answered Nov 22 '18 at 11:40









                          Ben Inada





























