Scraping Logos website for "Dynamic Sale Price" and "Dynamic Collection Value" of Logos packages

1Cor10 31
1Cor10 31 Member Posts: 791 ✭✭✭
edited November 2024 in English Forum

Hi everyone,

Is there a way to pull the "Dynamic Sale Price" and "Dynamic Collection Value" for all the Logos 10 packages into an Excel file automatically (instead of inputting them one by one)?

In the first column, I could write down the names of the Logos packages.

In the second column, I could give the weblink for the Logos package.

In the third column, I would like a query of some sort that will use the weblink in Column 2 to grab the Dynamic Sale Price and drop it into Column 3.

In the fourth column, I would like to use a similar query that will use the weblink in Column 2 to grab now the Dynamic Collection Value and drop it into Column 4.

Is the above possible? If so, any tips?

Thanks

I believe in a Win-Win-Win God.

Comments

  • Wolfgang Schneider
    Wolfgang Schneider Member Posts: 679 ✭✭✭

    1Cor10 31 said:

    Is the above possible? If so, any tips?

    I don't think this is possible ... because the "Dynamic pricing" depends on what the person ordering a package already owns in books and features (my current understanding from various previous communications about "you don't pay for what you already own"). So five customers with different currently owned books/features could each see a different "dynamic price" when purchasing the same item/package.

    Wolfgang Schneider

    (BibelCenter)

  • 1Cor10 31
    1Cor10 31 Member Posts: 791 ✭✭✭

    Thank you Wolfgang.

    I thought that the whole point of scraping is that the code just picks the number from the website. So if a different person uses the code, the person's own dynamic price should be picked up by the code. 

    Bottom line: As a non-tech person, I am not able to see why the fact that the price differs from person to person should prevent us from extracting the data. You can simply say - that's the way it works - and I'm happy to move on from this.

    I believe in a Win-Win-Win God.

  • DMB
    DMB Member Posts: 14,209 ✭✭✭✭

    Like Wolfgang, I'm not an expert. But your scraping has to allow the scripts to fill out the page; chances are you're only looking at the raw HTML delivery? The other issue I run into is that I turn images off, and quite a few pages build their content on images.

    "If myth is ideology in narrative form, then scholarship is myth with footnotes." B. Lincoln 1999.

  • Manuel R.
    Manuel R. Member Posts: 368 ✭✭✭

    1Cor10 31 said:

    I thought that the whole point of scraping is that the code just picks the number from the website. So if a different person uses the code, the person's own dynamic price should be picked up by the code. 

    That's definitely possible, with the important caveat that you would need to give the script/crawler your login information or a valid login token so it can "see" the corresponding dynamic price and scrape the data.
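
    A minimal sketch of that caveat, using only Python's standard library: the request carries a logged-in session cookie so the server can return the viewer's own dynamic price. The cookie name "SessionToken" and the product URL below are placeholders, not the real Logos values; you would copy the actual cookie name/value pair from your browser's developer tools after logging in.

```python
import urllib.request

def build_authenticated_request(url: str, session_cookie: str) -> urllib.request.Request:
    """Build a request that carries a logged-in session cookie.

    The cookie name "SessionToken" is a placeholder -- inspect the real
    cookies in your browser's developer tools after logging in to Logos.
    """
    return urllib.request.Request(
        url,
        headers={
            "Cookie": f"SessionToken={session_cookie}",
            "User-Agent": "Mozilla/5.0 (price-scraper sketch)",
        },
    )

# Building the request makes no network call; fetching would be
# urllib.request.urlopen(req).read() once the cookie is real.
req = build_authenticated_request("https://www.logos.com/product/12345", "abc123")
```

    Without that cookie the server has no idea who is asking, so it can only return the generic (not the dynamic) price.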
  • 1Cor10 31
    1Cor10 31 Member Posts: 791 ✭✭✭

    Hi DMB and Manuel,

    Would any of you know the script I have to write? Or point to where I might be able to read and write myself (I'm doubtful I have the skill, but I'll give it a try).

    Thanks

    I believe in a Win-Win-Win God.

  • Manuel R.
    Manuel R. Member Posts: 368 ✭✭✭

    I know how and could do it, but I'm sorry, I don't have the time. As a developer I am pretty busy. I also don't know whether it makes much sense to attempt this without understanding web programming / HTML. You would need Python and, for example, Selenium plus an HTML parser like Beautiful Soup. There are ready-to-use (often paid) web crawlers/scrapers, but you would need to configure them for this special use case.
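
    For the parsing half of the stack described above, here is a minimal standard-library sketch (no Selenium, no Beautiful Soup) of pulling a price out of HTML. The markup and the class names ("dynamic-sale-price", "dynamic-collection-value") are invented for illustration; the real ones must be found by inspecting the Logos product page.

```python
from html.parser import HTMLParser

# Hypothetical markup -- the real class names must be discovered by
# inspecting the actual Logos product page in your browser's dev tools.
SAMPLE_HTML = """
<div class="price-box">
  <span class="dynamic-sale-price">$123.45</span>
  <span class="dynamic-collection-value">$6,789.00</span>
</div>
"""

class PriceExtractor(HTMLParser):
    """Collect the text content of every tag with a given class attribute."""

    def __init__(self, target_class: str):
        super().__init__()
        self.target_class = target_class
        self._capturing = False
        self.values = []

    def handle_starttag(self, tag, attrs):
        if dict(attrs).get("class") == self.target_class:
            self._capturing = True

    def handle_data(self, data):
        if self._capturing:
            self.values.append(data.strip())
            self._capturing = False

def extract_price(html: str, css_class: str) -> str:
    parser = PriceExtractor(css_class)
    parser.feed(html)
    return parser.values[0] if parser.values else ""

sale_price = extract_price(SAMPLE_HTML, "dynamic-sale-price")
collection_value = extract_price(SAMPLE_HTML, "dynamic-collection-value")
```

    Note the catch DMB raised above: if the page fills those prices in with JavaScript rather than serving them in the HTML, a plain parser like this will see nothing, and a browser-driving tool such as Selenium is needed to let the scripts run first.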

  • 1Cor10 31
    1Cor10 31 Member Posts: 791 ✭✭✭

    Manuel R. said:

    I know how and could do it, but I'm sorry, I don't have the time. As a developer I am pretty busy. I also don't know whether it makes much sense to attempt this without understanding web programming / HTML. You would need Python and, for example, Selenium plus an HTML parser like Beautiful Soup. There are ready-to-use (often paid) web crawlers/scrapers, but you would need to configure them for this special use case.

    Thank you Manuel for responding. I thought it would be a line or two of code to pull out something like that. You've given me enough pointers to try and experiment.

    I believe in a Win-Win-Win God.

  • abondservant
    abondservant Member Posts: 4,796 ✭✭✭

    When we used to fill out a spreadsheet with all the base package info, we did it all by hand. 


    L2 lvl4 (...) WORDsearch, all the way through L10,

  • 1Cor10 31
    1Cor10 31 Member Posts: 791 ✭✭✭

    I did that too after Logos 10 release, but I was trying to automate this.

    I believe in a Win-Win-Win God.