# downtownplus.info - Updated robots.txt
# Purpose: stop Google from crawling query-parameter URLs.
# Note: robots.txt blocks crawling, not indexing; URLs already in the index
# may also need canonical tags or noindex to drop out of search results.

User-agent: *
Allow: /

# Block crawling of tag and tracking-parameter URLs
Disallow: /*?*tag=
Disallow: /*?*utm_source=
Disallow: /*?*utm_medium=
Disallow: /*?*utm_campaign=
Disallow: /*?*utm_term=
Disallow: /*?*utm_content=
Disallow: /*?*ref=
Disallow: /*?*source=
Disallow: /*?*fbclid=
Disallow: /*?*gclid=

# Keep API endpoints crawlable
Allow: /api/*

# Sitemap
Sitemap: https://downtownplus.info/sitemap-articles.xml

# Deployment:
# 1. Back up the current robots.txt:
#    scp -i ~/.ssh/vps_deploy_key deploy@162.43.41.64:/var/www/downtownplus-info/robots.txt /tmp/robots.txt.backup-20251103
#
# 2. Upload the new version:
#    scp -i ~/.ssh/vps_deploy_key /tmp/robots-txt-utm-rules.txt deploy@162.43.41.64:/tmp/robots-new.txt
#
# 3. Move it into production (via SSH):
#    ssh -i ~/.ssh/vps_deploy_key deploy@162.43.41.64
#    sudo cp /tmp/robots-new.txt /var/www/downtownplus-info/robots.txt
#    sudo chown www-data:www-data /var/www/downtownplus-info/robots.txt
#    sudo chmod 644 /var/www/downtownplus-info/robots.txt
#
# 4. Verify the live file:
#    curl https://downtownplus.info/robots.txt
#
# 5. Check in Google Search Console:
#    - Go to https://search.google.com/search-console
#    - Property: downtownplus.info
#    - Settings → robots.txt report
#    - Confirm the new version was fetched; request a recrawl if Google
#      still shows the old copy
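
# --- Pattern sanity check ---
# Hand-worked examples of how the wildcard rules above should behave under
# Google's documented matching ("*" matches any run of characters; the
# longest matching rule wins). The URLs are illustrative assumptions, not
# output from Google's actual matcher:
#
#   /articles/foo?utm_source=newsletter  -> blocked   (matches /*?*utm_source=)
#   /articles/foo?page=2                 -> crawlable (no Disallow rule matches)
#   /articles/foo?sort=asc&tag=events    -> blocked   (/*?*tag= matches "tag=" anywhere after "?")
#   /articles/foo?hashtag=events         -> blocked   ("*tag=" also matches the substring in "hashtag=")
#
# If the last case over-blocks, tighten the rule to the pair
# "Disallow: /*?tag=" and "Disallow: /*&tag=".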
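
# --- One-shot install (alternative to step 3) ---
# A sketch that runs the copy/chown/chmod without an interactive shell.
# It assumes the deploy user has passwordless sudo for these commands,
# which the steps above do not confirm:
#
#   ssh -i ~/.ssh/vps_deploy_key deploy@162.43.41.64 \
#     'sudo cp /tmp/robots-new.txt /var/www/downtownplus-info/robots.txt &&
#      sudo chown www-data:www-data /var/www/downtownplus-info/robots.txt &&
#      sudo chmod 644 /var/www/downtownplus-info/robots.txt'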
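
# --- Scripted verification (extends step 4) ---
# Two quick checks, assuming the local copy still sits at
# /tmp/robots-txt-utm-rules.txt. The diff should print nothing and the grep
# should print 10 (the number of Disallow rules) once the new file is live:
#
#   curl -fsS https://downtownplus.info/robots.txt | diff - /tmp/robots-txt-utm-rules.txt
#   curl -fsS https://downtownplus.info/robots.txt | grep -c '^Disallow: /\*?\*'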