New feature: Sermon Assistant

135 Comments

  • Jonathan Bradley
    Jonathan Bradley Member Posts: 921 ✭✭✭

    I didn't think about which subscription tier Sacha has.

    Pastor, Mt. Leonard Baptist Church, SBC

  • Aaron Zeller
    Aaron Zeller Member Posts: 1

    Are you also planning to integrate a kind of rewriting function that helps the author of a sermon to formulate the sentences differently and more comprehensibly? Applications such as DeepL Write or others have this. 

  • Anstey Jeremiah
    Anstey Jeremiah Member Posts: 12

    I think Logos should keep their original ideas about the AI Assistant. And I see lazy people, people I suspect are not preaching or teaching personnel at all, or people who are more concerned about how much spare time they can gain by using AI. I have to say I don't know much about AI, but I don't want it to know me too well either. We can get what we want from AI by posting different questions/themes, or by posting the same questions/themes differently. It would help if one printed these out.

    So far in my use, AI hasn't produced any erroneous answers as far as the Sermon Assistant is concerned; I can't say the same for the Search Assistant. I asked what kind of sin Adam committed, parabasis or hamartia, and I got some weird answers; maybe it doesn't recognize Greek transliteration. That's something you can work on. I would also like to put the generated question results in the handout and not in the body of the sermon.

    AI has a hidden feature that is being ignored, i.e. it gives ideas for something you didn't finish, or ideas for new sermons or Bible classes. For instance, it answered my first request on one thing, "overcoming temptation," and lo and behold the second generation was on repentance, I guess based on the scriptures I inserted. That is another important aspect: one needs to put in as many "suspected" scriptures as possible and save the AI-generated help, maybe in MS Word.

    One last thing: AI works best on old sermons, especially the Application section. You can use the theme at the beginning of the sermon and the applications to modify your endings. Thank you Logos, keep me focusing on the learning aspect of teaching and preaching.

  • Anstey Jeremiah
    Anstey Jeremiah Member Posts: 12

    Hey, we still also have to pray before any project; and our prayer should include asking God to help us use AI well before we reach that place where AI does all the work, especially when we stop fact-checking it.

  • MJ. Smith
    MJ. Smith Member, MVP Posts: 53,030 ✭✭✭✭✭

    Are you also planning to integrate a kind of rewriting function that helps the author of a sermon to formulate the sentences differently and more comprehensibly? Applications such as DeepL Write or others have this. 

    Welcome to the forums ... Not disagreeing with you, but offering some food for thought. Given that a student has already been charged with plagiarism simply because she depended too heavily on Grammarly (see Student put on probation for using Grammarly: 'AI violation' (nypost.com)), do we really want the AI writing within Logos, or do we want to make it a conscious user choice that requires moving to another tool?

    Orthodox Bishop Alfeyev: "To be a theologian means to have experience of a personal encounter with God through prayer and worship."; Orthodox proverb: "We know where the Church is, we do not know where it is not."

  • Adam Roe
    Adam Roe Member Posts: 3

    A feature I tend to use AI for in my writing is something like a feedback assistant. Instead of asking it to be the writer, I do the writing and ask it for feedback on how to improve what I've written. I wonder if something like this might be available for the sermon editor.

  • Stephan Thelen
    Stephan Thelen Member Posts: 590

    A feature I tend to use AI for in my writing is something like a feedback assistant. Instead of asking it to be the writer, I do the writing and ask it for feedback on how to improve what I've written. I wonder if something like this might be available for the sermon editor.

    Interesting idea. But what should the criteria for improving the message be? Structure? Simpler wording? Sentence length? Content?

  • Stephan Thelen
    Stephan Thelen Member Posts: 590

    One thing that I really miss is an AI filter that, based on my sermon, searches for a good illustration from my resources. The function that is included now is too simple. The assistants deliver too many results, so I have to read too many stupid illustrations. Neither of the included functions really helps…

  • Joshua Keatts
    Joshua Keatts Member Posts: 5

    Is there a way to have the sermon assistant access and utilize your favorite version of the Bible?

  • Pastor David
    Pastor David Member Posts: 6

    I have become adept at using ChatGPT models to create sermon titles, ideas, and even outlines, just as the AI Sermon Assistant does. One of the things I like about ChatGPT vs. Logos is the ability to come back to previous generations for review and use; Logos doesn't maintain any previous AI generations. Here's an example of why that matters. I'm working on my laptop, which has a touch screen, and I use my finger to scroll through the response that was generated. Wouldn't you know it...I miss the touch area and cause what was generated to go away. I can't get it back. And as is the case with AI, when I run the generation again, it is something entirely different. I can't get back to where I was. Very frustrating.

  • Michael Kaluza
    Michael Kaluza Member Posts: 1

    Hello, the questions generated by the AI are always the same. I have copied in Mark 11:1-11 as the sermon text. I wanted to get theological questions. But no matter which option I choose, I always get the same questions.

    Compare the meaning of repentance in today's society with the meaning of repentance according to the Bible.

    How can we apply the teachings from the sermon on loving our neighbor in our daily lives?

    Why do you think Jesus used parables to spread his message instead of teaching directly?

    What passages from the Bible were quoted in the sermon and how do they help us to better understand the topic?

    What concrete steps can we take to live up to the calls for reconciliation in the sermon?

    Not a single question really addresses the text theologically. What does Mark 11:1-11 have to do with repentance or reconciliation?

    Unfortunately, I cannot yet see the benefit of this function.

  • Mark Barnes (Logos)
    Mark Barnes (Logos) Administrator, Logos Employee Posts: 1,888

    Hello, the questions generated by the AI are always the same. I have copied in Mark 11:1-11 as the sermon text. I wanted to get theological questions. But no matter which option I choose, I always get the same questions.

    Compare the meaning of repentance in today's society with the meaning of repentance according to the Bible.

    How can we apply the teachings from the sermon on loving our neighbor in our daily lives?

    Why do you think Jesus used parables to spread his message instead of teaching directly?

    What passages from the Bible were quoted in the sermon and how do they help us to better understand the topic?

    What concrete steps can we take to live up to the calls for reconciliation in the sermon?

    Not a single question really addresses the text theologically. What does Mark 11:1-11 have to do with repentance or reconciliation?

    Unfortunately, I cannot yet see the benefit of this function.

    The Questions function of the Sermon Assistant is designed to create questions on sermons, not on biblical passages. Nonetheless, it looks like we have a bug in that some passage blocks are being ignored, and therefore you're only getting generic questions.

    As a temporary workaround, if you edit your passage block by clicking on the pencil, you'll then be able to get some reasonable questions on the passage.

  • I have the Pro subscription, but in Sermon Assistant I do not have the "outline" button. I only have options for Illustrations, Questions, and Applications. Is anyone else having this issue?

  • Ben Leney
    Ben Leney Member Posts: 11

    Hi, I've just been trying out the Questions feature in the Sermon Assistant. (Loving the Sermon Outline function, by the way.) But where you can select 'Children' for the Questions, the wording of the questions is far too complex to lift them and use, for example, in a quiz after a talk. Is there any way that, for 'Children', the wording of the questions can be pitched at their level?

    An example of a question I got, based on a sermon outline generated on 'How God saves us':

    Explain the concept of justification by faith in relation to God's unfailing grace, as taught in the sermon. How does this concept help children comprehend the unconditional nature of God's love?

    What would be great would be if the question were pitched at the child and in childlike language: 'What does justification by faith actually mean?' (not the best version of this, hence I'd love more help!)

  • Mark Barnes (Logos)
    Mark Barnes (Logos) Administrator, Logos Employee Posts: 1,888

    I have the Pro subscription, but in Sermon Assistant I do not have the "outline" button. I only have options for Illustrations, Questions, and Applications. Is anyone else having this issue?

    Can you check that your version of Logos is up-to-date?

  • Mattillo
    Mattillo Member Posts: 6,114 ✭✭✭

    I have the Pro subscription, but in Sermon Assistant I do not have the "outline" button. I only have options for Illustrations, Questions, and Applications. Is anyone else having this issue?

    Can you check that your version of Logos is up-to-date?

    Another thought... try the web app and go to sermon builder. If it is there and not on your desktop, then you can narrow down the issue. 

  • Buck McDaniel
    Buck McDaniel Member Posts: 28

    I'm impressed. I'm just playing around with it and learning about it. I haven't done sermons, but I'm following along with MP's video. I don't know where else the Sermon Assistant tools work, but this is great for finding quotes and illustrations for papers or for my commentary.

    I'm looking forward to learning more.

    Shabbat Shalom

  • Will Butterfield
    Will Butterfield Member Posts: 12

    It would be super helpful to mark the question (color change, star, check, etc.) in the list when it is added to the sermon—something to denote that I have already inserted it.

  • Renold Burke
    Renold Burke Member Posts: 72

    Suggestion:

    Letting the Sermon Assistant generate from resources I choose would be great. Right now it's picking and choosing from all resources, which is also great, but sometimes I have a specific book in mind. Or am I missing something?

    Titles impress people, obedience impresses God.

  • Anstey Jeremiah
    Anstey Jeremiah Member Posts: 12

    If you choose "Books" instead of "All", a line opens up that says "Your books." When you click on that, a drop-down list opens with a space where you can type a specific book.

  • Graham Criddle
    Graham Criddle Member, MVP Posts: 32,427 ✭✭✭

    Hi Anstey

    If you choose "Books" instead of "All", a line opens up that says "Your books." When you click on that, a drop-down list opens with a space where you can type a specific book.

    This works in a Books Search, but I don't believe it is available in the Sermon Assistant.

    Graham

  • Rev. Delwyn X. Campbell Sr
    Rev. Delwyn X. Campbell Sr Member Posts: 10 ✭✭

    Can this be adjusted to fit your theological perspective? I am a Lutheran (LCMS). How can I avoid getting AI input that would be contrary to Lutheran teaching?

  • Graham Criddle
    Graham Criddle Member, MVP Posts: 32,427 ✭✭✭

    Can this be adjusted to fit your theological perspective? I am a Lutheran (LCMS). How can I avoid getting AI input that would be contrary to Lutheran teaching?

    This is not currently possible in the Sermon Assistant - and I don't know if there are any plans to provide it.

    On a related topic, if you are using a Books Search for an AI-assisted Smart Search, you can restrict the search results to a particular subset of books (such as a collection), which would give you the functionality you are looking for there.

  • Jim Dean
    Jim Dean Member Posts: 312 ✭✭

    I have been surprised by the disparity in sophistication between ChatGPT, or meta-AI, and the current Logos offerings. I understand something of the complexity involved in the task, but nonetheless I have found that tasks such as those undertaken by the Sermon Assistant are usually far better handled by ChatGPT than by the native Logos AI. I am hoping that will be dramatically improved in the not-too-distant future.

    Because of my curiosity about this, I posed a question to ChatGPT and got a very interesting and quite detailed reply.

    My question was: I am interested in examining a hypothetical integration. I own over 16,000 theological books in the Logos environment. If somehow access to all of those books was made available to the ChatGPT engine, could ChatGPT be directed to answer all queries from a given person using only the sources from those books? … Here is the reply …


    Integrating ChatGPT with Logos Bible Software to enable ChatGPT to answer queries exclusively using your extensive library of over 16,000 theological books is an intriguing idea. While such an integration doesn't exist natively, it's theoretically feasible through a combination of available technologies and custom development. Here's an overview of how this could be approached, along with the considerations and challenges involved:


    ### **1. Understanding the Scope**


    - **Volume of Data**: Managing and processing 16,000 theological books is a substantial undertaking. Ensuring that the AI can efficiently access and retrieve relevant information from such a vast repository is critical.

      

    - **Content Licensing**: Verify that you have the rights to use and process the content from these books, especially if you plan to use them in a way that might involve redistribution or public access.


    ### **2. Potential Integration Approaches**


    #### **a. Retrieval-Augmented Generation (RAG)**

       

    - **How It Works**: RAG combines traditional retrieval systems with language models. When a user poses a question, the system first retrieves relevant documents from the Logos library and then uses ChatGPT to generate a response based solely on those documents.


    - **Implementation Steps**:

      1. **Data Indexing**: Convert your Logos library into a searchable format. This might involve exporting texts from Logos and storing them in a database optimized for quick retrieval.

      

      2. **Building a Retrieval System**: Use tools like Elasticsearch or vector databases (e.g., Pinecone, FAISS) to index the content and enable efficient search operations.

      

      3. **Connecting with ChatGPT**: Develop a middleware application that takes user queries, retrieves relevant excerpts from the indexed data, and feeds them into ChatGPT as context for generating responses.


    - **Advantages**:

      - Ensures responses are grounded in your specific set of resources.

      - Scalable with proper infrastructure.


    - **Challenges**:

      - Requires significant technical expertise to set up and maintain.

      - Ensuring the retrieval system accurately surfaces the most relevant and authoritative sources.
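
    Purely as an illustration of the RAG steps above, a minimal sketch might look like the following. This is not an existing integration: it assumes the `openai` Python package (v1.x) and `numpy`, and `load_book_chunks()` is a hypothetical placeholder, since Logos offers no public export API and getting the text out would be a separate, manual step.

```python
# Minimal RAG sketch: retrieve the most relevant excerpts, then ask the model
# to answer only from them. Assumes the `openai` package (v1.x) and `numpy`.
# load_book_chunks() is a hypothetical placeholder -- Logos has no public
# export API, so producing these text chunks is left as a manual step.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def load_book_chunks() -> list[str]:
    # Hypothetical helper: short plain-text chunks exported from the library.
    return ["Chunk one of some commentary ...", "Chunk two ..."]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

chunks = load_book_chunks()
chunk_vecs = embed(chunks)

def answer(question: str, k: int = 3) -> str:
    q_vec = embed([question])[0]
    # Rank chunks by cosine similarity to the question.
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec))
    context = "\n\n".join(chunks[i] for i in np.argsort(sims)[::-1][:k])
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer only from the provided excerpts. "
                        "If they do not contain the answer, say so."},
            {"role": "user",
             "content": f"Excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("What does this library say about justification by faith?"))
```

    In practice the in-memory arrays above would be replaced by one of the vector databases mentioned in step 2, but the retrieve-then-generate shape stays the same.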


    #### **b. Fine-Tuning or Custom Training**


    - **How It Works**: Fine-tune a language model like ChatGPT on your specific corpus so that its responses are influenced directly by the content of your 16,000 books.


    - **Implementation Steps**:

      1. **Data Preparation**: Organize and preprocess the texts from Logos to create a training dataset.

      

      2. **Fine-Tuning Process**: Use OpenAI’s fine-tuning APIs or other machine learning frameworks to train the model on your data.

      

      3. **Deployment**: Host the fine-tuned model in an environment where it can handle queries and generate responses based solely on its training.


    - **Advantages**:

      - Responses can be highly tailored to the specific content of your library.

      

    - **Challenges**:

      - OpenAI's current offerings may limit the extent to which large proprietary datasets can be used for fine-tuning.

      - Fine-tuning with such a large corpus (16,000 books) might be resource-intensive and costly.

      - Maintaining and updating the model as new books are added or existing ones are updated.
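
    As a sketch only: OpenAI's chat fine-tuning endpoint expects JSONL training data in a chat-message format, roughly as below. The question/answer pairs are invented placeholders, and whether book text may be used this way is the licensing question already raised above.

```python
# Sketch of preparing chat-format JSONL training data for fine-tuning.
# The examples below are invented placeholders; real pairs would be derived
# (with appropriate rights) from the exported library.
import json

examples = [
    {
        "messages": [
            {"role": "system", "content": "Answer from the owner's theological library."},
            {"role": "user", "content": "Summarise the argument of Romans 5."},
            {"role": "assistant", "content": "A summary drawn from the library ..."},
        ]
    },
    # ... one object per training example
]

with open("train.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# The file would then be uploaded and a job started via the fine-tuning API
# (e.g. client.files.create(..., purpose="fine-tune") followed by
# client.fine_tuning.jobs.create(...)), subject to size and policy limits.
```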


    #### **c. API-Based Integration**


    - **How It Works**: Utilize Logos' APIs (if available) to fetch relevant content dynamically and use ChatGPT to process and present the information.


    - **Implementation Steps**:

      1. **API Access**: Ensure that Logos provides APIs that allow programmatic access to its content.

      

      2. **Middleware Development**: Create an application that handles user queries, interacts with Logos APIs to retrieve relevant passages, and then uses ChatGPT to formulate responses based on those passages.

      

      3. **User Interface**: Develop a user-friendly interface where queries can be entered and responses displayed seamlessly.


    - **Advantages**:

      - Leverages existing Logos infrastructure for content retrieval.

      - Potentially more straightforward than building a custom retrieval system from scratch.


    - **Challenges**:

      - Dependence on the capabilities and limitations of Logos’ APIs.

      - Ensuring real-time performance and relevance of retrieved content.
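
    For illustration only, the middleware described above could be as small as a single web endpoint. FastAPI is used here purely as an example, and `fetch_logos_passages()` is a hypothetical stub: Logos does not publicly document a content-retrieval API, so that step is the unsolved part of this approach.

```python
# Hypothetical middleware sketch: fetch passages, then have the model answer
# only from them. fetch_logos_passages() is a placeholder, not a real API.
from fastapi import FastAPI
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

def fetch_logos_passages(query: str) -> list[str]:
    # Placeholder for whatever retrieval mechanism is actually available.
    return ["Passage text retrieved for the query ..."]

@app.get("/ask")
def ask(q: str) -> dict:
    passages = "\n\n".join(fetch_logos_passages(q))
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer only from the supplied passages."},
            {"role": "user", "content": f"{passages}\n\nQuestion: {q}"},
        ],
    )
    return {"answer": resp.choices[0].message.content}
```

    (Assuming the file is saved as middleware.py, it could be served with something like `uvicorn middleware:app`, which would double as the user interface in step 3.)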


    ### **3. Technical and Practical Considerations**


    - **Infrastructure Requirements**: Handling and processing a large volume of data requires robust infrastructure, including storage solutions, processing power, and efficient databases.


    - **Latency and Performance**: Ensuring that responses are generated promptly, especially when dealing with real-time queries and large datasets, is crucial for user experience.


    - **Data Privacy and Security**: Safeguard the theological works and any user data involved in the process, complying with relevant data protection regulations.


    - **Maintenance and Updates**: Regularly updating the system to incorporate new books, revisions, and maintaining the integration's health over time.


    ### **4. Alternative Solutions and Tools**


    - **Knowledge Management Platforms**: Explore platforms that specialize in integrating large document repositories with AI capabilities. Tools like **Haystack** or **LangChain** can facilitate building custom AI-powered search and response systems.


    - **Custom Development Services**: If the technical aspects are beyond your expertise, consider collaborating with software developers or companies specializing in AI integrations to build a bespoke solution.


    ### **5. Leveraging OpenAI's Offerings**


    - **OpenAI API**: Utilize OpenAI’s API to build the conversational interface, ensuring that it communicates effectively with your retrieval system.


    - **Embeddings**: Use OpenAI’s embedding models to convert your texts into vector representations, enabling efficient semantic searches and relevance matching.


    - **Compliance and Usage Policies**: Ensure that your use case complies with OpenAI’s usage policies, especially concerning data handling and content generation based on proprietary materials.
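
    To make the embeddings point above concrete, here is a tiny sketch (again assuming the `openai` package, v1.x, and `numpy`; the sample sentences are invented): semantically related texts produce nearby vectors, so cosine similarity can rank relevance.

```python
# Tiny embedding/similarity sketch for semantic relevance matching.
import numpy as np
from openai import OpenAI

client = OpenAI()

texts = [
    "Justification by faith in Paul's letters",  # related to the query
    "Recipes for sourdough bread",               # unrelated control
]
query = "How does Paul describe being made right with God?"

data = client.embeddings.create(model="text-embedding-3-small",
                                input=texts + [query]).data
vecs = np.array([d.embedding for d in data])
sims = vecs[:2] @ vecs[2] / (np.linalg.norm(vecs[:2], axis=1) * np.linalg.norm(vecs[2]))
print(sims)  # the first text should score noticeably higher than the second
```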


    ### **6. Potential Benefits**


    - **Enhanced Research Capabilities**: Seamlessly querying a vast theological library can significantly aid in research, sermon preparation, and theological studies.


    - **Consistency and Reliability**: Responses grounded in your curated set of resources ensure theological accuracy and consistency with your specific doctrinal positions.


    - **Efficiency**: Automating the retrieval and summarization of relevant content can save considerable time compared to manual searching and reading.


    ### **7. Challenges and Limitations**


    - **Complexity of Integration**: Combining two sophisticated systems like ChatGPT and Logos requires substantial technical effort and expertise.


    - **Quality Control**: Ensuring that the AI responses are not only relevant but also accurate and contextually appropriate based on the source material.


    - **Scalability**: As your library grows or as more users interact with the system, maintaining performance and relevance can become increasingly challenging.


    ### **8. Conclusion**


    While integrating ChatGPT with Logos Bible Software to utilize your extensive theological library is complex, it is achievable with the right technical approach and resources. This integration could provide a powerful tool for theological study and discourse, offering tailored, resource-backed responses to a wide range of queries. To move forward, you might consider:


    - **Assessing Technical Requirements**: Evaluate the technical feasibility based on your resources and expertise.

      

    - **Consulting with Experts**: Engage with software developers or AI specialists who have experience with similar integrations.

      

    - **Exploring APIs and Tools**: Investigate the available APIs from Logos and OpenAI, as well as tools that facilitate document retrieval and AI interaction.


    By carefully planning and executing the integration, you can create a robust system that leverages the strengths of both Logos and ChatGPT to enhance your theological work.

    =============
    Redeeming the time (Eph.5:16+Col.4:5) ... Win 10, iOS & iPadOS 16
    Jim Dean

  • MJ. Smith
    MJ. Smith Member, MVP Posts: 53,030 ✭✭✭✭✭

    Sorry, but I am firmly on the go slow approach to the use of AI despite my heavy use of it in my TIP of the Day series. I will continue in this position until the workers in companies producing AI products and the researchers providing the framework for AI products have a similar view of what has been achieved.

    I have been surprised by the disparity in sophistication between ChatGPT, or meta-AI, and the current Logos offerings.

    That is because they have very different goals. So far, and I hope into the distant future, Logos is taking a conservative approach to the use of AI. This limits the hallucinations - I use Perplexity because it has a much lower rate of hallucinations than ChatGPT - and minimizes doctrinal bias. Essentially, neural networks lack judgement when it comes to alternative understandings. Furthermore, the information science behind the chatbots shows that they degrade/self-implode when too much of their own work is fed back through as training material (see The Disinformation Dilemma: The Inevitable Degradation of AI Chatbot Data | by The Brykman Predicament | Geek Culture | Medium). From my personal perspective, the primary purpose of Bible study is to change me through an intimate conversation with God -- what I often refer to as reading a love letter. AI is currently far from the wise mentor needed to enhance that.

    I asked Perplexity: "Do LLMs and chatbot technology have the capability to train on 100,000 theology books, then provide the logical inconsistencies of particular positions on topics and identify the presuppositions inherent in those positions?"

    To which it replied:

    Large language models (LLMs) and chatbots have significant capabilities when it comes to analyzing large volumes of text and identifying patterns, but there are some important considerations regarding their ability to deeply analyze theological positions:

    Capabilities:

    1. LLMs can be trained on vast amounts of theological literature, potentially including 100,000+ books.

    2. They can identify common themes, arguments, and terminology across different theological positions.

    3. They can compare and contrast different viewpoints on theological topics.

    4. They can potentially identify logical inconsistencies within individual arguments or between related positions.

    5. LLMs may be able to surface some of the underlying assumptions or presuppositions behind theological stances.

    Limitations:

    1. LLMs do not truly "understand" theology in a human sense - their analysis is based on statistical patterns in text.

    2. They may struggle with nuanced interpretations and the full context behind complex theological positions.

    3. Identifying logical inconsistencies in abstract theological arguments requires sophisticated reasoning that current LLMs may not fully achieve.

    4. Surfacing implicit presuppositions, especially those deeply embedded in a theological worldview, is challenging even for human experts.

    5. LLMs can be biased based on their training data and may not fairly represent all theological perspectives.

    6. They lack the ability to engage in original theological reasoning or critique beyond what is present in their training data.

    While LLMs are powerful tools for analyzing large bodies of theological literature, their ability to deeply critique theological positions or uncover hidden presuppositions is likely still limited compared to expert human theologians. They are best used as assistive tools to aid human analysis rather than as standalone arbiters of theological consistency or validity.

    Citations:
    [1] https://thethink.institute/articles/presuppositions
    [2] https://globalpastorinstitute.com/presuppositions-in-theology-by-paul-wright-ph-d/?v=36b03dc28b10
    [3] https://www.reddit.com/r/theology/comments/1b8hp44/anyone_know_how_to_articulate_the_difference_of/
    [4] https://answersingenesis.org/presuppositions/presuppositions/
    [5] https://cbtseminary.org/5-presuppositions-of-apologetics-dewey-dovel/
    [6] https://www.bibletools.org/index.cfm/fuseaction/topical.show/RTD/cgg/ID/557/Name.htm
    [7] https://christoetdoctrinae.com/articles/av2c9sy5houybcx04mdz1vzs8nbqyc
    [8] https://www.breakerministries.com/blog/2019/5/10/the-prophetic-potential-hidden-in-a-name

    Orthodox Bishop Alfeyev: "To be a theologian means to have experience of a personal encounter with God through prayer and worship."; Orthodox proverb: "We know where the Church is, we do not know where it is not."

  • xnman
    xnman Member Posts: 2,775 ✭✭✭

    I think I'll stick with Claude 2 by Anthropic... for now anyway.  I predict by the end of the year there will be many more "A.I." programs... It seems inevitable to me with everyone and their brothers all getting into A.I.

    As far as use with a bible program like Logos... I appreciate Logos going slow and being conservative about the whole matter... personally.

    xn = Christian  man = man -- Acts 11:26 "....and the disciples were first called Christians in Antioch".

    Barney Fife is my hero! He only uses an abacus with 14 rows!