
View Full Version : URL or API to check for current version of patch files?


patlefort
09-03-2025, 03:05 AM
I would like to automate version checks and updates. Does Project 1999 provide a way to query the latest version?

loramin
09-03-2025, 10:31 AM
There's no API, but there's https://www.project1999.com/ ... new patches are always posted as news items on the front page

patlefort
09-03-2025, 10:50 AM
Thanks for the answer. Scraping a website for a link is far from ideal, though; I'll have to think of a solution. My suggestion for the website would be to simply host a downloadable file that contains the version or filename of the latest release of the patch files. Just a file containing a string.
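If such a file existed, the launcher side would be tiny. A sketch, assuming a hypothetical URL and a plain one-line version string (neither exists today; the URL and filename here are made up):

```python
import urllib.request

# Hypothetical endpoint -- the site does not provide this yet.
VERSION_URL = "https://www.project1999.com/latest_patch.txt"

def fetch_latest_version(url=VERSION_URL, timeout=10):
    """Download the one-line version string and strip whitespace."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read().decode("utf-8").strip()

def needs_update(local_version, remote_version):
    # A plain string comparison is enough if the file just holds
    # the latest patch filename or a version tag.
    return local_version != remote_version
```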

loramin
09-03-2025, 11:14 AM
... or you could just wait until you try to login and it tells you to patch (like everyone else) :)

patlefort
09-03-2025, 06:32 PM
But why? It would be trivial to implement. I'm trying to create a launcher for Linux that makes it as easy as possible, but for that I need to automate the update process. Part of that process is converting everything to lower case to make sure files are all correctly overwritten. And yes, I've had issues with that, which is why I'm doing this.
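For the lower-casing step, a standard-library sketch of what I mean (walk bottom-up so directory renames don't invalidate child paths; note this would collide if two names differed only by case):

```python
import os

def lowercase_tree(root):
    """Rename every file and directory under root to lower case so
    patch files overwrite their originals regardless of case."""
    # topdown=False visits the deepest directories first, so renaming
    # a directory never breaks the paths of entries still to visit.
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        for name in filenames + dirnames:
            lower = name.lower()
            if lower != name:
                os.rename(os.path.join(dirpath, name),
                          os.path.join(dirpath, lower))
```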

Tann
09-05-2025, 09:59 AM
Scraping a website for a link is far from ideal

maybe not ideal but it's super simple, here:

from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as ec
from selenium.webdriver.common.by import By

options = webdriver.ChromeOptions()
options.add_experimental_option("detach", True)
options.add_experimental_option("excludeSwitches", ["enable-logging"])
driver = webdriver.Chrome(options=options)

driver.get("https://www.project1999.com/")

try:
    WebDriverWait(driver, 10).until(
        ec.element_to_be_clickable(
            (By.CSS_SELECTOR, "#collapseobj_module_16 > tr > td > a:nth-child(7)")
        )
    ).click()
except Exception as e:
    print(f"you dun goofed: {e}")
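and if you don't want a whole browser for this, the same idea works with plain urllib plus a regex. The assumption here (untested) is that the patch announcement links straight to a .zip and the newest one appears first on the page:

```python
import re
import urllib.request

def find_patch_links(html):
    # Grab every .zip href on the page, in document order.
    return re.findall(r'href="([^"]+\.zip)"', html)

def latest_patch_link(url="https://www.project1999.com/"):
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    links = find_patch_links(html)
    return links[0] if links else None
```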

loramin
09-05-2025, 11:09 AM
You ChatGPTed that didn't you ;)

Tann
09-05-2025, 11:11 AM
that? no, i use scrapers on the daily so i just Frankensteined some of my other code.

loramin
09-05-2025, 11:22 AM
that? no, i use scrapers on the daily so i just Frankensteined some of my other code.

Heh, sorry:

WebDriverWait(driver, 10).until(ec.element_to_be_clickable((By.CSS_SELECTOR, "#collapseobj_module_16 > tr > td > a:nth-child(7)"))).click()

looked like a selector (and just code in general) an AI would write.

Tann
09-05-2025, 11:27 AM
I wouldn't put it past my former self to have used chatgpt originally, the websites I frequent are often slow to respond so I had to include a lot of wait timers and try/catches so the code wouldn't crap itself.

loramin
09-05-2025, 11:46 AM
I wouldn't put it past my former self to have used chatgpt originally, the websites I frequent are often slow to respond so I had to include a lot of wait timers and try/catches so the code wouldn't crap itself.

Welcome to Selenium :)

If you're doing a lot of web automation, I'd highly recommend upgrading to Playwright: it's a lot better (Cypress is pretty good/popular too, but I prefer Playwright as Cypress has these weird non-promise promise things that drive me nuts).

patlefort
09-05-2025, 11:53 AM
Scraping a website is a bad idea and should only be used as a last resort: any change to the website can break it. It's not that it's complicated, but a stable, approved method provided by the website is obviously better.