You can get all of this using the famous 5-line starter example:
https://docs.llamaindex.ai/en/stable/getting_started/starter_example/

You can also customize the prompt so the answer comes back in the format you need.
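For reference, the starter example looks roughly like the sketch below (the `data` folder, the question, and the custom prompt are placeholders, and import paths can differ slightly between llama-index versions); the `text_qa_template` override at the end shows one way of steering the answer format:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, PromptTemplate

# Load documents from a local folder ("data" is a placeholder path)
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index and a query engine over it
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Optional: override the QA prompt so the answer comes back in a specific format
# (prompt key and template variables follow the prompt customization docs)
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer the query using only the context above, as a bulleted list.\n"
    "Query: {query_str}\n"
    "Answer: "
)
query_engine.update_prompts({"response_synthesizer:text_qa_template": qa_prompt})

response = query_engine.query("What did the author do growing up?")
print(response)
```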
Now, to show the relevant sources from which the answer was generated, you can check the source nodes (these are the nodes that were used to form the final answer).

Do this:
print(response.source_nodes)
and you will get a list of the nodes used to generate the answer. You can then display them in whatever format you want, for example:
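Each entry in that list is a NodeWithScore, so (assuming the standard attributes, which may shift a bit across versions) you can pull out the score, metadata, and text yourself:

```python
# Sketch of custom source formatting; assumes each entry exposes
# .score, .node.metadata and .node.get_content() (NodeWithScore interface)
for source in response.source_nodes:
    print(f"Score:    {source.score}")
    print(f"Metadata: {source.node.metadata}")             # e.g. file_name, page_label
    print(f"Text:     {source.node.get_content()[:200]}")  # first 200 chars of the chunk
    print("-" * 40)
```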